Notes on codes, projects and everything
Getting comfortable with asyncio takes a bit of practice, so I revisited a practice project I did while working at my previous company. Suppose I want to build a very simple websocket application without using any web application library/framework. To keep it simple, I also opt to build the frontend with a minimal setup (plain ES6, no webpack/vite).
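The websocket handshake itself can be done with nothing but asyncio streams and the standard library. Below is a minimal sketch of that handshake (my own illustration, not the project's actual code); the frame encoding/decoding would follow after it.

```python
# A minimal sketch of the RFC 6455 opening handshake using only asyncio
# streams -- no web framework involved. Frame parsing is left out.
import asyncio
import base64
import hashlib

WS_GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"  # fixed GUID from RFC 6455

async def handle(reader, writer):
    # Read the HTTP upgrade request: request line, then headers until blank line.
    await reader.readline()
    headers = {}
    while True:
        line = await reader.readline()
        if line in (b"\r\n", b""):
            break
        name, _, value = line.decode().partition(":")
        headers[name.strip().lower()] = value.strip()

    # Sec-WebSocket-Accept is the base64-encoded SHA-1 of the client key + GUID.
    key = headers["sec-websocket-key"]
    accept = base64.b64encode(
        hashlib.sha1((key + WS_GUID).encode()).digest()
    ).decode()

    writer.write(
        b"HTTP/1.1 101 Switching Protocols\r\n"
        b"Upgrade: websocket\r\n"
        b"Connection: Upgrade\r\n"
        b"Sec-WebSocket-Accept: " + accept.encode() + b"\r\n\r\n"
    )
    await writer.drain()
    # ... websocket frame encoding/decoding would go here ...

async def main():
    server = await asyncio.start_server(handle, "127.0.0.1", 8765)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```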
My cloud storage is nearly bursting, and I am not in a good position to subscribe for more (yes, I am still #opentowork). Considering I just moved my domain settings to Cloudflare and started using Cloudflare Tunnel, I figure I should just back up some of my photos and host them on my workstation.
I was invited to try Go (the programming language, not the board game) a few months ago, but I didn’t follow through back then. The main reason was that it felt raw compared to other languages I know a fair bit better (Ruby, for example). There was not much syntactic sugar around, and getting work done with it felt “dirty”.
Recently the term “Semantic Web” has become so popular that Sitepoint blogs keep posting articles on the topic (1, 2). In my college days I learned about semantic networks, and I wonder whether there is some relationship between the two. I’m not sure whether I get the concept right, but in this article I would like to revisit semantic networks a bit before moving on to the Semantic Web. Please correct me if I’m wrong.
I happened to see this post a few months ago, and the author created another cloud that uses almost the same technique to ‘visualize’ a list of countries. The author originally used PHP to generate the cloud, and I thought I might be able to do it in JavaScript. After some quick coding I managed to produce something similar to the first example; source code after the jump.
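The usual trick behind this kind of cloud is to map each item’s count onto a font size. The sketch below shows that mapping in Python purely for illustration (the post’s actual code is JavaScript, and the labels and numbers here are made up).

```python
# A rough illustration of the common tag/word cloud scaling: map each item's
# count to a font size between a chosen minimum and maximum.
MIN_SIZE, MAX_SIZE = 10, 48  # font sizes in px; arbitrary choices

def font_sizes(counts):
    """Return a {label: px} mapping where px grows linearly with count."""
    lo, hi = min(counts.values()), max(counts.values())
    span = (hi - lo) or 1  # avoid division by zero when all counts are equal
    return {
        label: round(MIN_SIZE + (count - lo) / span * (MAX_SIZE - MIN_SIZE))
        for label, count in counts.items()
    }

# Made-up example: country -> count
print(font_sizes({"Malaysia": 42, "Japan": 7, "Germany": 19}))
```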
Oftentimes I am dealing with JSONL files, and while pandas’ DataFrame is great (and blaze to a certain extent), it offers far more than the job needs. Most of the data I receive is structured text, and I do all sorts of work with it: checking for consistency, replacing values based on other columns, stripping whitespace, and so on.
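For this kind of light-weight clean-up, the standard library usually suffices. Below is a minimal stdlib-only sketch (my own, not the post’s code) that reads JSONL from stdin, strips whitespace from string fields, and rewrites one field based on another; the field names are purely hypothetical.

```python
# A stdlib-only sketch of per-line JSONL clean-up: strip whitespace and
# rewrite one (hypothetical) field based on another, one record per line.
import json
import sys

def clean(record):
    # Strip stray whitespace from every string value.
    record = {
        key: value.strip() if isinstance(value, str) else value
        for key, value in record.items()
    }
    # Hypothetical rule: fill in one column based on another column's value.
    if record.get("country") == "MY" and not record.get("currency"):
        record["currency"] = "MYR"
    return record

for line in sys.stdin:
    line = line.strip()
    if line:
        print(json.dumps(clean(json.loads(line))))
```

Usage would be along the lines of `python clean.py < input.jsonl > output.jsonl`.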
Recently I switched my search code to Annoy because the input dataset is huge (7.5 million records with a 20k-entry dictionary). It was not without issues, but I will probably talk about those next time. In order to figure out what each parameter means, I spent some time watching the talk given by the author, @fulhack.
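The parameters in question are presumably Annoy’s n_trees (fixed at build time) and search_k (tunable per query). The sketch below is only a toy example of where they appear in the Python API, with made-up dimensions and random data rather than the actual 7.5-million-record dataset.

```python
# Toy example of Annoy's two main accuracy/speed knobs:
#   n_trees  -- more trees => larger index, better recall (build time)
#   search_k -- more nodes inspected => slower but more accurate queries
import random
from annoy import AnnoyIndex

DIM = 100           # vector dimensionality (an assumption for this sketch)
N_TREES = 50
SEARCH_K = 10_000

index = AnnoyIndex(DIM, "angular")
for i in range(1_000):  # toy data only
    index.add_item(i, [random.gauss(0, 1) for _ in range(DIM)])
index.build(N_TREES)

query = [random.gauss(0, 1) for _ in range(DIM)]
print(index.get_nns_by_vector(query, 10, search_k=SEARCH_K))
```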