Notes on codes, projects and everything
Getting comfortable with asyncio takes a bit of practice, so I revisited a practice project I did while working for my previous company. Suppose I want to build a very simple websocket application, without using any web application library/framework. To keep things simple, I also opted to build the frontend with minimal setup (just plain ES6, without webpack/vite).
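To give an idea of the shape of it, here is a minimal sketch of the server side. It assumes the websockets library on top of asyncio (a protocol library rather than a web framework), and the echo handler is a hypothetical stand-in for the real application:

    import asyncio

    import websockets  # pip install websockets

    async def echo(ws, path=None):
        # Older versions of the library pass a second path argument,
        # newer ones do not; the default keeps both happy.
        async for message in ws:
            await ws.send(f"echo: {message}")

    async def main():
        # serve() is an async context manager; keep the server open
        # until the task is cancelled.
        async with websockets.serve(echo, "localhost", 8765):
            await asyncio.Future()  # run forever

    if __name__ == "__main__":
        asyncio.run(main())

On the browser side, plain ES6 is enough: new WebSocket("ws://localhost:8765") plus a couple of event handlers, no bundler required.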
My cloud storage is nearly exploding, and I am not in a good position to start subscribing for more storage (yes, I am still #opentowork). Considering I just moved my domain settings to Cloudflare and started using Cloudflare Tunnel, I figured I should just back up some of my photos and host them on my workstation.
Being new to asyncio, after publishing the previous post on running multiple applications in one event loop, I cross-posted it to the discussion board for feedback. Apparently, instead of cramming everything into the same event loop, it would be better if each application ran on a separate thread. That makes sense, considering how the code for that post was written.
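A rough sketch of that arrangement, with app_a and app_b as hypothetical placeholders for the actual applications: each one gets a fresh event loop in its own thread, courtesy of asyncio.run().

    import asyncio
    import threading

    async def app_a():
        while True:
            print("app_a tick")
            await asyncio.sleep(1)

    async def app_b():
        while True:
            print("app_b tick")
            await asyncio.sleep(2)

    def run_in_thread(coro_factory):
        # asyncio.run() creates a brand new event loop for the thread
        # it runs in, so the two applications never share a loop.
        thread = threading.Thread(
            target=asyncio.run, args=(coro_factory(),), daemon=True
        )
        thread.start()
        return thread

    if __name__ == "__main__":
        for thread in [run_in_thread(app_a), run_in_thread(app_b)]:
            thread.join()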
I used to develop a bot, partly for work, that fetches the latest petrol retail prices in Malaysia. The bot was really an experiment, but at the time it worked well. A few years later, out of boredom, I revisited the project after finding that the telegram bot library was moving towards asyncio. It looked great (at least a lot of people rave about it), but at the same time intimidating: I had learned about coroutines and used gevent in the past, but never asyncio itself.
This is the formal draft of my statistical analysis report for the social audit project previously mentioned here. As the project is public by nature, I am cross-posting it here for my own reference.
The Internet Censorship Dashboard is a project that aggregates data fetched from the OONI API to provide an overview of the current state of internet censorship experienced by users, mainly in Southeast Asia. The current form was built a couple of years ago, and the project recently got funded to be updated to work better with the new APIs.
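For reference, the aggregation boils down to queries like the following sketch against the measurements endpoint. The endpoint and parameters here are my assumptions based on the v1 API, so double check against the current OONI documentation:

    import requests

    API = "https://api.ooni.io/api/v1/measurements"

    def fetch_measurements(probe_cc, since, until, limit=100):
        # List measurement metadata for one country within a date range.
        resp = requests.get(API, params={
            "probe_cc": probe_cc,  # two-letter country code, e.g. "MY"
            "since": since,        # e.g. "2024-01-01"
            "until": until,
            "limit": limit,
        }, timeout=30)
        resp.raise_for_status()
        return resp.json().get("results", [])

    if __name__ == "__main__":
        rows = fetch_measurements("MY", "2024-01-01", "2024-02-01")
        anomalies = sum(1 for row in rows if row.get("anomaly"))
        print(f"{len(rows)} measurements, {anomalies} flagged as anomalies")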
Back when I was still working on my postgraduate degree research, I used RDF, which was the preferred format for representing data in the world of the Semantic Web. I eventually dropped the degree and stopped following the development of the related technology and standards, until I volunteered to update the import script for popit while looking for my next job/project.
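For those unfamiliar, representing data in RDF is just a matter of stating triples. A tiny illustration with rdflib (my choice for this sketch, not necessarily what the popit import script uses), with made-up example URIs:

    from rdflib import Graph, Literal, URIRef
    from rdflib.namespace import FOAF, RDF

    g = Graph()

    person = URIRef("http://example.org/people/alice")
    g.add((person, RDF.type, FOAF.Person))        # subject, predicate, object
    g.add((person, FOAF.name, Literal("Alice")))

    # Turtle is one of the common text serializations for RDF.
    print(g.serialize(format="turtle"))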
In recent years, I started to decouple my development environment from the tools delivered by the operating system's package manager. Those tools (compilers, interpreters, libraries, etc.) are usually best left unmodified so that other system packages relying on them keep working as intended. Another reason for the setup is that I want to follow the latest releases as much as possible, which cannot be done unless I enroll myself in a rolling release distro.
Just recently I volunteered to run a pre-101 kind of workshop for people wanting to learn programming. I had done this a few times in the past, but in different settings and with different goals in mind. The whole structure predates those sessions, but I can't remember when I first created it.
This is the year I kept digging up my old undergraduate notes on statistics for work. First was my brief attempt at wearing the Data Scientist hat, performing an ANOVA test to see whether there is an association between pairs of variables. Then, just recently, I was tasked with analyzing survey results for a social audit project.
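As a reminder of what that looks like in code, here is a minimal one-way ANOVA sketch with scipy, using made-up scores grouped by a hypothetical respondent category:

    from scipy import stats

    # Hypothetical numeric responses, one list per category.
    group_a = [3.1, 2.8, 3.5, 3.0]
    group_b = [4.0, 4.2, 3.8, 4.1]
    group_c = [2.5, 2.9, 2.7, 3.0]

    f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
    # A small p-value suggests at least one group mean differs, i.e.
    # the categorical and numeric variables are likely associated.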
One of my recent tasks involved crawling a lot of geo-tagged data from a given service. The most recent one was crawling files containing a point cloud for a given location. I began by observing the behaviour in the browser: after exporting the list of HTTP requests involved in loading the application, I noticed a lot of requests fetching resources with a common rXXX pattern.
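The observation step is roughly this sketch: export the network activity as a HAR file from the browser devtools, then tally the URLs matching the recurring pattern. The r\d+ regex and the capture.har filename are my placeholders:

    import json
    import re
    from collections import Counter

    PATTERN = re.compile(r"/r\d+")  # e.g. /r123 segments in resource URLs

    def count_patterned_requests(har_path):
        # HAR is plain JSON; requests live under log.entries.
        with open(har_path, encoding="utf-8") as f:
            har = json.load(f)
        counter = Counter()
        for entry in har["log"]["entries"]:
            counter.update(PATTERN.findall(entry["request"]["url"]))
        return counter

    if __name__ == "__main__":
        counts = count_patterned_requests("capture.har")
        for segment, count in counts.most_common(10):
            print(segment, count)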
So my cheat with dask worked fine and dandy, until I started inspecting the output (which was to be used as input for another script). While the script seemed to work fine, when I started parsing each line I was hit with some funny syntax errors. After some quick inspection I found that some of the lines were not printed completely.
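Until the writing side is fixed, a defensive parser at least keeps one truncated line from crashing everything downstream. A sketch, assuming the intermediate file is line-delimited JSON (swap in whatever parser matches the actual format):

    import json

    def parse_lines(path):
        records, skipped = [], 0
        with open(path, encoding="utf-8") as f:
            for lineno, line in enumerate(f, 1):
                try:
                    records.append(json.loads(line))
                except json.JSONDecodeError:
                    # Incompletely written lines fail to parse; log and move on.
                    skipped += 1
                    print(f"skipping truncated line {lineno}")
        print(f"parsed {len(records)} lines, skipped {skipped}")
        return records

The proper fix is upstream, of course: a common approach is to have each worker write to its own file (or serialize the writes) so no line gets interleaved or cut short.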
After delaying for quite some time, I think I should start the project before I get bored with it. The project will be hosted on this current domain (coolsilon.com), at least for now, and will probably move to another domain if needed. The site will be either a blog aggregator or a simple article submission site that works kind of like digg/reddit; however, to be promoted to the front page, a submission would have to impress the opposite group.