Notes on codes, projects and everything
One of my recent tasks involves crawling a lot of geo-tagged data from a given service. The most recent one is crawling files containing a point cloud for a given location. So I began by observing the behavior in the browser. After exporting the list of HTTP requests involved in loading the application, I noticed that a lot of requests fetch resources following a common rXXX pattern.
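To get a feel for how many of those resources there are, a quick script over the exported request list helps. Below is a minimal sketch, assuming the export is a HAR file and that the shared pattern looks like "r" followed by digits; both the file name and the regex are my own assumptions, not details from the actual service:

```python
import json
import re
from collections import Counter

# Assumption: the requests were exported from the browser as a HAR file.
HAR_PATH = "requests.har"

# Hypothetical pattern: resources named like .../r123/... -- adjust to the
# actual rXXX naming the service uses.
R_PATTERN = re.compile(r"/(r\d+)\b")

with open(HAR_PATH) as f:
    har = json.load(f)

counts = Counter()
for entry in har["log"]["entries"]:
    url = entry["request"]["url"]
    match = R_PATTERN.search(url)
    if match:
        counts[match.group(1)] += 1

# Show the most frequently requested rXXX resources.
for name, total in counts.most_common(10):
    print(name, total)
```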
This is the second part of the golang learning rant log. Previously on (note (code cslai)), I managed to turn each line of the CSV into a hash map. So today I am going to write them out as JSON Lines.
Oftentimes I am dealing with JSONL files. pandas' DataFrame is great (and blaze to a certain extent), but it offers far more than the job needs. Most of the data I receive comes as structured text, and I do all sorts of work on it: checking for consistency, replacing values based on other columns, stripping whitespace, and so on.
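For that kind of work, plain line-by-line processing with the standard library is usually enough. Here is a minimal sketch, with hypothetical field names like name, status and remark standing in for the real columns:

```python
import json

def clean(record):
    # Strip whitespace from every string value.
    for key, value in record.items():
        if isinstance(value, str):
            record[key] = value.strip()
    # Replace a value based on the value of another column (field names are
    # made up for illustration).
    if record.get("status") == "inactive":
        record["remark"] = "archived"
    return record

def is_consistent(record):
    # A simple consistency check: required fields must be present and valid.
    return bool(record.get("name")) and record.get("status") in {"active", "inactive"}

with open("input.jsonl") as src, open("output.jsonl", "w") as dst:
    for line in src:
        record = clean(json.loads(line))
        if not is_consistent(record):
            print("inconsistent record:", record)
            continue
        dst.write(json.dumps(record) + "\n")
```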
A long, long time ago, when I was working with Prolog, I was introduced to lists. Unlike arrays in most popular programming languages, you cannot really access a particular member directly; every list is constructed as a chain-like structure.
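The same chain-like construction can be mimicked in Python with nested (head, tail) pairs. This is only an illustration of the idea, not how Prolog actually stores lists; the point is that reaching the n-th element means walking the chain from the front:

```python
# A Prolog-style list mimicked as nested (head, tail) pairs:
# [1, 2, 3] becomes (1, (2, (3, None))).
def cons(head, tail=None):
    return (head, tail)

def nth(chain, index):
    # No direct indexing: walk the chain one link at a time.
    while index > 0:
        if chain is None:
            raise IndexError("list too short")
        chain = chain[1]
        index -= 1
    if chain is None:
        raise IndexError("list too short")
    return chain[0]

numbers = cons(1, cons(2, cons(3)))
print(nth(numbers, 2))  # -> 3
```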
I often struggle to keep my JavaScript code organized, and have tried numerous ways to do so. I have tried putting relevant code into classes and instantiating them as needed, then abusing jQuery's data() method to store everything (from scalar values to functions and callbacks). Recently, after learning (briefly) how a jQuery plugin should be written, I found that it greatly simplifies my code.
Recently I volunteered to build a site that reports whether certain websites are blocked locally (please don't ask why that is happening). As it is a very simple status-reporting app, I wanted it to be easily scrape-able. One of the decisions made was that it should have something to see on first load, which practically rules out React, my current favorite.