Lpninjas Blog

Blog for #100DaysOfCode, where "Day #:" entries are coding-challenge logs and "R&D:" entries are research notes.

Follow me on Twitter @solarengineer, where I tweet about coding, solar energy and IoT.

Day 1: Chose Freecodecamp and figured out I will do the backend course. Set up Git, GitHub Desktop, and GitHub Pages.

Day 2: Started with editing package.json on a backend challenge. I still can't figure out how to submit the challenge correctly.

Day 3: Got my github.io stack up and continued to write here. Still stuck on their submission form, so I asked them. Smeagol responded: apparently you cannot submit from static pages like GitHub Pages and need to use what they suggest (Replit) to run and submit your code. I switched to Replit and it worked instantly. Working through the problems now, updating package.json and adding description, author, keywords, license, and version.

Day 4: Working through dependencies in package.json. Replit is working well. People asked me why I am doing the #100DaysOfCode challenge: I want to improve my backend skills, and I am really curious.

Day 5: About 50% through the first part of the course. Dependencies again. Quote of the day - "It is not time to wake up the sheep, it is time to wake up the Lions!"

Day 6: Working on some new CSS syntax. Interestingly, CSS can create nearly any flat shape or object. Snowboarding.

Day 7: Stuck again, and I can't figure out if it is a syntax error or a logic error. Some of these dependency libraries may be obscure. I will check tomorrow. Blizzard running.

Day 8: Got unstuck; it was a silly syntax issue. Onwards. We have a serious blizzard ongoing here, so coding is great. Watch out for falling snow (from rooftops).

Day 9: Looking at semantic versioning. Luckily I had already gone over this a while back, but it is great to refresh. Versioned a dependency in a package.json. Helped dig a random person's car out. #hokkaidolife

Day 10: Infrastructure failure day and wasn't able to charge my laptop properly. HTML only today. #hokkaidolife

Day 11: Dependencies not being recognised in the solution checker for my repo. Have asked them why.

Day 12: Found the error in my dependency versioning. I was using "^2.10.2" instead of "2.10.2". Setting it to ^2.10.2 is not the same thing as setting it to 2.10.2. It is the difference between saying, “I want a scoop of vanilla, but if there are some sprinkles or whatever, that’s cool, as long as there’s a scoop of vanilla in there,” and saying, “I want a scoop of vanilla. Period.” Also figured out the use of the tilde character "~", which allows updates to the latest patch version of a package, i.e. 2.10.x. The caret character "^" additionally allows minor updates, i.e. 2.x.x, in semantic versioning. Now completed this section and I am up to the Node console, where console.log("Hello World") was written.
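
As a concrete illustration of the three pinning styles (the package name here is just a placeholder, not the actual challenge dependency), the entries in a package.json "dependencies" block look like this:

"my-package": "2.10.2" (exact: only 2.10.2 is installed)
"my-package": "~2.10.2" (tilde: any patch release, i.e. >=2.10.2 and <2.11.0)
"my-package": "^2.10.2" (caret: any minor or patch release, i.e. >=2.10.2 and <3.0.0)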

Day 13: Node console and Express. Writing some of my first JavaScript and looking at function syntax.

Day 14: Functions in JavaScript. Snow walking. #hokkaidolife

Day 15: console.log(variable); is working well. Paths and GET requests.

Day 16: Serving Static Assets. Building some test apps by serving HTML files. Looking at middleware next.
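
A minimal sketch of serving static assets and an HTML file with Express (the folder names and port here are my own placeholders):

const express = require('express');
const app = express();

// serve CSS, images, etc. from a /public folder
app.use('/public', express.static(__dirname + '/public'));

// serve the test HTML page
app.get('/', (req, res) => {
  res.sendFile(__dirname + '/views/index.html');
});

app.listen(3000);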

Day 17: Successfully connected to Express (middleware) and served the HTML file. Now the HTML file references a CSS style file, and my login page looks more stylish! The next steps would be to start learning the database connectors. I currently like the MERN stack (MongoDB, Express, React, NodeJS). We finally got to over 0 degrees here in Sapporo. #Hokkaidolife

Day 18: {"message":"Eat Bananas and Hello json"}- wow this updates fast. I am starting to play with JSON strings from myapp.js.

Day 19: I am actually on a hardware and SD-card failure easter-egg hunt today, so not much coding other than HTML. I also came across the possibility of keeping IoT boot files on a server and doing power cycling over PoE, but I haven't delved into it yet.

Day 20: Stuck again on this .env setup. Working through it.

Day 21: Still stuck, no coding.

Day 22: Kind of less stuck; figured out how to transform strings with the .toUpperCase() method. Replit and the Secrets tab (system environment variables) are functioning.

Day 23: Doldrums. Wow, I can't seem to get my .env variable to change my JSON output. Is there a great bug-finding editor for #javascript?

Day 24: .env and I need to get better at sequential coding and local variables.

Day 25: Got some good feedback and onto the next challenge.

Day 26: Setting up NAT traversal. Stuck on the next challenge.

Day 27: Missed the previous two days completely as I was recovering. Looking at the next challenge.

Day 28: Implement a Root-Level Request Logger Middleware. Trying out this challenge.
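
A root-level request logger in Express looks something like this (the exact log format is my guess at what the challenge wants, assuming the app from earlier):

app.use((req, res, next) => {
  // log every incoming request before passing control on
  console.log(req.method + ' ' + req.path + ' - ' + req.ip);
  next();
});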

Day 29: Looking at this Express middleware challenge and not exactly sure why I would need it. I guess it is for when something needs to interrupt a response pattern and execute conditionally, like logging GET, DELETE or PUT requests.

Day 30: Looking at loops FOR & WHILE today.

Day 31: Authorization using Middleware functions.

const authorizeRoute = (req,res,next) => {}
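
Filling in that stub as a sketch of where this is going (the header name, AUTH_TOKEN variable, and /secret route are my own placeholders, not the challenge's):

const authorizeRoute = (req, res, next) => {
  // placeholder check: a real version would verify a session or signed token
  if (req.headers['x-auth-token'] === process.env.AUTH_TOKEN) {
    return next(); // authorized, continue to the route handler
  }
  res.status(401).json({ error: 'Unauthorized' });
};

app.get('/secret', authorizeRoute, (req, res) => res.json({ ok: true }));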

Day 32: While loops. Seems very similar to what I wrote before in Java, e.g.

let n = 0; while (n < 3) { n++; }
Snowboarding. It is getting to be late season now, but the snow was still great.

Day 33: Back to .env variables.

Day 34: Realized I was coding wrong with the "let" keyword (it is a keyword, not a function), and this was some good progress.

Day 35: Been struggling with the outcome of an if statement lately. I am still learning a lot of the basic aspects of JavaScript.

Day 36: I seriously get stuck a lot. So I have finally figured out const and let and how variable assignment works in scopes. For example, if I declare a const outside a pair of curly brackets {}, it holds one value. I can then declare a const with the same name inside the curly brackets, and if I console.log inside the brackets I get the new inner value as the output, not the value I originally declared outside; the inner declaration shadows the outer one. The difference with let is that a let variable can be reassigned later, while a const cannot.
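
A minimal sketch of the shadowing and reassignment behaviour described above (the variable names are just for illustration):

const flavour = 'vanilla';
{
  const flavour = 'chocolate'; // inner constant shadows the outer one inside this block
  console.log(flavour);        // 'chocolate'
}
console.log(flavour);          // 'vanilla' - the outer constant is unchanged

let count = 0;
count = 1;                     // allowed: let bindings can be reassigned
// flavour = 'mint';           // would throw: assignment to constant variable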

R&D: I found another interesting thread where a Bitcoin maxi describes the necessity of Proof-of-Work and there are some legitimate points here. However this is a huge topic and it is multi-disciplinary, so I will document my results in this blog.

Day 37: Shipping out Datalogger products today to a customer so not much coding other than making my blog look more presentable.

Day 38: Ok, so I am still stuck on this env-variable checker function using Secrets in Replit. I understand that it works differently from a normal setup, where I can save my .env file locally, versus using it in Replit. But I still think there is some kind of problem with my actual code inside the handler function itself. Essentially, placing the line of code that checks the value of the environment variable outside of the handler function means the environment variable is checked only once, when the application first starts. However, when you check the value of the environment variable inside the handler function, you check it every time the handler is called. This difference is important because the test suite sends a command to the application to change the environment variable while the application is running. If you do not check the value of the environment variable inside the handler, then it is literally impossible for the handler to respond to this change. (This is the part that got me...) Manually changing the environment variable involves restarting the application, which isn't how the test suite works. It also isn't how you would want a real-world application to work. You shouldn't need to restart the application to change this behavior.
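
A minimal sketch of what I think the working version looks like, checking the variable inside the handler (the MESSAGE_STYLE name and /json route are illustrative, and I am assuming the Express app from earlier):

app.get('/json', (req, res) => {
  let message = 'Hello json';
  // read the environment variable inside the handler, so every request sees its current value
  if (process.env.MESSAGE_STYLE === 'uppercase') {
    message = message.toUpperCase();
  }
  res.json({ message });
});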

R&D: Great discussion about DAO's here. Worth a listen if you want to learn more about DAO's. One reminder, without the Proof-of-Work blockchain of Bitcoin backstopping the entire system, largely the multi-chain world would have a hard time functioning. Remember, Ethereum itself was started by a crowdraise from Bitcoin. That said, the multi-chain world we currently see may already be too big to fail but nobody knows for sure.

Day 39: I upgraded my setup here to VS Code and it is literally an amazing editor and studio. I was frustrated with the way that Replit treats .env files with their Secrets tab. I realised that if I am going to work on more complex server projects, I should just run things locally with the terminal and the correct dependencies and environment files.

R&D: Blockchains as tools to represent more stakeholders in the global community state machine. The talk from Ethan Buchman on Cosmos, Tendermint and sovereignty within crypto is here. I am also in agreement with Ethan about Bitcoin with respect to its role as a thermodynamic anchor using Proof-of-work. I would prefer to tackle the problems that we are facing on the planet with Bitcoin at our back vs. not having it. That said, there are also interesting consensus algorithms that are being used in Cosmos and others. Tendermint is the operating system that runs the low level blockchain. Cosmos SDK is the toolkit that helps developers build applications on top of Tendermint and finally the inter blockchain communication protocol (IBC) is a way to connect to other blockchains.

Day 40: Recalibrated my Git setup on another computer. I need to dive into the next challenge ASAP.

Day 41: Middleware with request, response and next. It is quickly occurring to me that middleware is a very powerful step for handling logins, all sorts of auth actions, and logging. A great resource for Express middleware is here.

Day 42: Ok back at it after an extended break. Looking at some amazing implementations using Node.js. A good resource is here.

Day 43: Simple middleware function without a mount path, e.g.

app.use((req, res, next) => { console.log('Time:', Date.now()); next(); });

Day 44: Loading a series of middleware functions at a mount point, with a mount path. It illustrates a middleware sub-stack that prints request info for any type of HTTP request to the /user/:id path. Kind of cool that you can specify the mount point.
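
For reference, the pattern looks roughly like this (adapted from the Express middleware docs, assuming the app from earlier):

app.use('/user/:id', (req, res, next) => {
  console.log('Request URL:', req.originalUrl); // first function in the sub-stack
  next();
}, (req, res, next) => {
  console.log('Request Type:', req.method);     // second function, runs after the first calls next()
  next();
});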

Day 45: Got my loggers going on console.log with middleware. Now trying to see if this is pushing to JSON also with res.json.

Day 46: I was able to use the res.json output to form a JSON message of the outputs of my middleware. This was an interesting problem to solve.
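
My reading of how this fits together, as a sketch (the req.logEntry property and the /log route are my own names, not the course's):

app.use((req, res, next) => {
  // record some request info on the req object so later handlers can use it
  req.logEntry = { method: req.method, path: req.path, time: Date.now() };
  console.log(req.logEntry);
  next();
});

app.get('/log', (req, res) => {
  res.json(req.logEntry); // return the middleware's output as JSON
});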

R&D: Looking into the Ethereum Proof-of-stake here. This may take a while.

Day 47: I started chaining middleware to reuse code in different places. This approach can also be used to perform some validation on the data. At each point of the middleware stack you can block the execution of the current chain and pass control to functions specifically designed to handle errors.
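
A sketch of what I mean by chaining and passing control to an error handler (the validator, route, and error format are illustrative):

const requireName = (req, res, next) => {
  // validation step: block the chain and hand an error to the error handler if input is missing
  if (!req.query.name) return next(new Error('name is required'));
  next();
};

app.get('/greet', requireName, (req, res) => {
  res.json({ greeting: 'Hello ' + req.query.name });
});

// error-handling middleware takes four arguments and catches anything passed to next(err)
app.use((err, req, res, next) => {
  res.status(400).json({ error: err.message });
});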

Day 48: Hitting a path that returns res.json as an echo server. This is quite neat in my experience, and it seems powerful because you can set any folder/path combination. I guess the next step would be to add a UserId and some other database parameters related to that UserId.
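
A minimal echo route along these lines (the /:word/echo shape is my assumption about the challenge):

app.get('/:word/echo', (req, res) => {
  res.json({ echo: req.params.word }); // echoes back whatever path segment was requested
});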

Day 49: I finally managed to build an API endpoint mounted at GET /name. It responds with a JSON document with the structure { name: 'firstname lastname' }. It has taken me a while to get to the stage of making an API endpoint. After I set up databases in the next part of this course, things will start to get interesting.
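
The endpoint itself is roughly this; reading first and last from the query string is my assumption about how the names arrive:

app.get('/name', (req, res) => {
  const { first, last } = req.query;
  res.json({ name: first + ' ' + last });
});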

R&D: Continuation on my original research into COSMOS. e.g. the Empire model vs. the City State model here. Ultimately after hearing both arguments it raised more questions than answers. A few points that I wrote down. Nobody really knows what the definition of money is. Value capture in a movement of the velocity of obligations is one side of the scale, the other side being a pet rock like (Gold) that is deflationary. The unit of account problem space that Buchman mentions is another strand of research and questioning that hasn't even begun. Proof-of-stake isn't the final solution, and continually needs to be improved. Ethereum will also fall into the same problems that other proof-of-stake chains had, but on the other hand for blockchains to innovate proof-of-stake is needed as well. Proof-of-work is still really important to anchor thermodynamically to the real world. In the end I am more into the bottom up emergent thesis that Buchman is proposing with COSMOS.

Day 50: Halfway through the 100-day challenge. Ok, so I am back to some coding with middleware functions. A NodeJS program is basically like a conveyor belt with a number of operations that can happen along the way. In the middleware part of the conveyor, the app.use() function assigns what happens for certain operations. In today's case, we are directing how to parse a payload that is associated with a POST request in NodeJS. I installed it at the top of the file as a variable called var bodyParser = require("body-parser"); then I passed bodyParser.urlencoded to the app.use() function. Later I used the same approach to parse the JSON payload.
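
The setup described above, as a sketch (the extended: false option is my choice, not necessarily what the course used):

var bodyParser = require('body-parser');

// parse application/x-www-form-urlencoded payloads from POST requests
app.use(bodyParser.urlencoded({ extended: false }));

// parse application/json payloads
app.use(bodyParser.json());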

Day 51: I completed the Freecodecamp Basic Node and Express course. My next step is to jump over to MongoDB university and complete an intro course over there.

Day 52: Back at it, looking at setting up MongoDB.

Day 53: Looking at TYPES of setup configuration.

Day 54: Finished the setup of my own self-hosted Rocket.Chat server on my bare-metal box. I am still having issues with the Caddy snap service, although it is a practical project that I have now completed. One of the things holding me up was the fact that my old router didn't have port forwarding. This new router is amazing.

Day 55: Ok, I messed around with the Caddy service in the Rocket.Chat app. It is a way to wrap specific http:// ports with a TLS certificate, so that you can have some more peace of mind that your self-hosted Rocket.Chat app is behind https://. Unfortunately, it may be that my bare-metal architecture does not support the latest snap version of this. Investigating.

Day 56: Deep backend stuff today. I am still wondering whether you build a stack in a DEV environment and break it intentionally as you scale, or whether you do something smarter like elastic scaling. Server-stack design in Linode based on perceived requirements for RAM and the number of dedicated CPUs.

Day 57: I am welcomed into the ESRI developer universe to solve all of my reverse-geocoding problems with a simple line of code? I am probably jumping into a rabbit hole. On the other side, there are so many options with ArcGIS and all of these location services, basemaps and layers. It looks like a powerful toolkit.

Day 58: Set up a MongoDB test on a Linode Nanode. e.g. Setup. Alternatively there is another way to provision a database directly in Linode. I am exploring those options.

Day 59: I am testing in PROD! Well, not really; this is an experiment to deploy my npm project using Netlify. It is kind of exciting to see what the next steps are. Figuring out this framework has been enjoyable. I have deployed the login page, but I have not connected any MongoDB backend. That would naturally be the next step in the process. #trusttheprocess

Day 60: Made a dbConnect.js script that uses Mongoose and a MONGODB_URI constant declared in .env to connect to MongoDB.
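
My dbConnect.js is roughly along these lines (a sketch, not the verbatim file; loading the variable with dotenv is my assumption):

require('dotenv').config();            // loads MONGODB_URI from .env
const mongoose = require('mongoose');

const dbConnect = async () => {
  await mongoose.connect(process.env.MONGODB_URI);
  console.log('Connected to MongoDB');
};

module.exports = dbConnect;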

R&D: I went back into POS vs. PoW research. There was a POS paper by the Cardano team stating that they have produced a provably secure proof-of-stake blockchain protocol and they named it "Ouroboros". This was published in 2019. I will link it here. After I digest it some more I will comment.

Day 61: I added a new app that is a login framework for a secret page using Node.js and Express (express();). It then references a MongoDB database.

Day 62: I was able to set up my external environment to code directly on my Linode via ssh login. The Linode is provisioned in Tokyo and there is literally no lag. I will be making some experimental apps on it in the next week. As an aside, it is getting super cold here #Hokkaidolife, and it will probably snow this week.

Day 63: I spent the day installing the npm packages for the login framework that I have created/modified. I was able to copy a standard setup by another user. I am polishing it to work with Ubuntu 22.04, where libssl1.1 needed to be updated and installed separately. I did this by installing MongoDB for Ubuntu 20.04 and then adding some additional elements.

Day 64: Launched the login page. NodeJS, ExpressJS, MongoDB. Please note, there is nothing to log into yet.

Day 65: I found an interesting plugin called qgis2web that generates a web map from your current QGIS project. This is either as OpenLayers, Leaflet, or Mapbox GL JS. It replicates as many aspects of the project as it can, including layers, styles (including categorized and graduated), and extent. I am excited to try this as I would make the html available on my server behind the login page that I have made throughout the last 64 days of coding. Let's try this.

Day 66: I got comfortable with containerisation and using a Dockerfile. I containerised their test to-do-list app on my Linode server. Also, when installing MongoDB on different versions of Ubuntu, there are different dependencies to take care of. I am currently comfortable adding MongoDB to Ubuntu 20.04. Check out my app here.

Day 67: Setting up my .env carefully for a backend that was created. Cloning repo to my Linux box Linode server.

Day 68: Working with some Python code that was written (not by me) to convert .xlsx files into a MongoDB database. It also references an API that does reverse geocoding. I should have worked with the original .xlsx files that this code was written for, and not the converted files.

Day 69: I was able to get the geocoding working with the Python scripts. I made Google Geocoding API calls and got 4600+ addresses geocoded to latitude and longitude in about 15 minutes in the terminal. The results were also ingested into my MongoDB.

Day 70: I set up Python pandas in a Jupyter notebook. Very swish stuff. You can even export your notebooks as HTML!

Day 71: I decided to spend 20 minutes and learn some CSS. It is quite logical, and there are a heap of CSS generators out there for whatever you want. Here is the example.

Day 72: In deployment of the first alpha version of my app. The deployment isn't going too well; I am just working through a lot of npm updates. It seems that these packages are changing and getting updated almost daily. Forget about a version of a package that is over a year old, that is way out of date.

Day 73: Today I want to migrate the MongoDB that is running inside Docker to the MongoDB that is running on my base Linode environment. This seems to be a more in-depth task than I originally thought.

Day 74: I figured out how to at least access MongoDB inside a running Docker container and copy some files out. I have also started another repo that connects to an email hook, so you can sign up with email verification. Simple but important, I guess. It is peak snow season here #hokkaidolife.

Day 75: Migrating my MongoDB out of a Docker container was a necessity, because over time my database was becoming more valuable; if I removed the Docker container, the database inside it would be lost too. I had to dump the database that had been built up inside the container and then restore it into a MongoDB instance running natively on the base Linode I had created. Ok, so after the initial attempts we managed to nuke the MongoDB on the Linode thanks to some permission problems! So I created a completely new Linode with MongoDB pre-installed (which is actually quite helpful). Once that was running and authenticated, I was able to scp my dump from the original Docker MongoDB over to my local computer and then finally to the new Linode. I then restored the database dump with the command:

mongorestore mongodb://127.0.0.1 --authenticationDatabase=admin -u admin -p YOURPASSWORD /pathto/yourdirectory

Day 76: Decided to do the simplest presentation of MongoDB data in EJS that I could. I found a really helpful tutorial from Marina Kim, a developer from Kazakhstan, where she explains how to represent variables in EJS and then put your MongoDB data into an HTML table. It was quite helpful. You need Express and Mongoose.
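
The Express side of that idea boils down to something like this (the User model and table.ejs view names are my own placeholders, and I am assuming the app and Mongoose connection from earlier):

app.set('view engine', 'ejs');

app.get('/users', async (req, res) => {
  const users = await User.find({});   // User is a Mongoose model defined elsewhere
  res.render('table', { users });      // table.ejs loops over users to build the HTML table
});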

Day 77: I am currently diving into the Dione Protocol because it is meant to be the "electricity trading blockchain" and the "private electricity decentralised hardware node blockchain". One of the main problems I currently see is that it is Proof-of-Stake (PoS) and is a fork of the Avalanche Layer 1. I will still investigate it further, as there may be some privacy benefits to using this chain. They are also apparently building dedicated hardware to run a validator node on 100W of solar panels.

Day 78: I was experimenting with these TEMPer USB plug-and-play (well, nearly) temperature sensors on the server in my server room. The server room was getting a little warm, and I wanted to check the temperatures remotely and periodically. So I wrote a small shell script that polls the Python script and outputs the temperature to the command line. Quite neat. The Python file was written by a third party and is open source.

Day 79: It has recently come to my attention that MOD 90 is an inherent pattern in the Bitcoin source code and in the main supply equation for Bitcoin. It is a significant find and suggests that the person or people who initially coded Bitcoin had a deep understanding of MOD 90, vortex mathematics, and the Fibonacci sequence, among others, including the smallest fractal unit of the Tesla 3, 6, 9 pattern. For more information about it please check here. It could be very significant.