Notes on codes, projects and everything
This is the formal draft of my statistical analysis report for the social audit project previously mentioned here. As the project is public by nature, I am cross-posting it here for my own reference.
This is the year I kept digging up my old undergraduate notes on statistics for work. First was my brief attempt at wearing the Data Scientist hat, performing ANOVA tests to see whether there is a correlation between pairs of variables. Then, just recently, I was tasked with analyzing the survey results for a social audit project.
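The post does not show the actual survey data, so purely as an illustration of the kind of test mentioned above, here is a minimal one-way ANOVA sketch in Python. The column and group names are placeholders, not the real variables from the project.

```python
# Minimal sketch of a one-way ANOVA with SciPy; the "group" and "score"
# columns below are hypothetical, not the actual survey variables.
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "group": ["a", "a", "b", "b", "c", "c"],
    "score": [3.1, 2.9, 4.0, 4.2, 5.1, 4.8],
})

# Split the numeric values by group and run the F-test.
samples = [g["score"].values for _, g in df.groupby("group")]
f_stat, p_value = stats.f_oneway(*samples)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
```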
After a miserable trip back to the academic world, I finally regained the courage to get back to the job market. During my time at university, I spent quite some time reading about the Semantic Web and RDF. Back then I thought I should publish more in this format in the future. However, that didn't really happen, mostly because I am too lazy.
Previously, I started practising recursion by implementing a type check on lat (a list of atoms), and ismember
(whether an atom is a member of a given lat). Then, in the third chapter, named “Cons the Magnificent”, more list manipulation methods are introduced.
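The book's exercises are in Scheme, so the following is only a rough Python rendering of the two recursive checks mentioned above, meant to illustrate the recursion structure rather than reproduce the original code.

```python
# Rough Python rendering of the lat and ismember checks; the book itself
# uses Scheme, so this is only an illustration of the recursion structure.

def is_lat(lst):
    """True if every element of lst is an atom (i.e. not a list)."""
    if not lst:                      # the empty list counts as a lat
        return True
    if isinstance(lst[0], list):     # a nested list means it is not a lat
        return False
    return is_lat(lst[1:])           # recur on the rest of the list

def is_member(atom, lat):
    """True if atom occurs somewhere in the lat."""
    if not lat:                      # nothing left to compare against
        return False
    return lat[0] == atom or is_member(atom, lat[1:])

print(is_lat(["a", "b", "c"]))       # True
print(is_lat(["a", ["b"], "c"]))     # False
print(is_member("b", ["a", "b"]))    # True
```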
After publishing the previous note on setting up my development environment, I found myself spending more time in the CLI (usually via SSH from the host). I also found that I didn't need all the GUI apps in a standard Ubuntu desktop environment, so I went ahead and set up a new environment based on the Ubuntu Quantal server edition beta-1. For some reason my network stopped working, and I didn't really want to spend time finding out the cause, so I reinstalled everything again today using the final installer, as well as the updated VirtualBox 4.2.6.
I have been following this excellent guide written by Benjamin Thomas to set up my virtual machine for development purposes. However, when I started configuring an Ubuntu Quantal alpha machine, parts of the guide no longer applied. Hence, this post is written as a small revision to the previously mentioned guide.
One of my recent tasks involved crawling a lot of geo-tagged data from a given service. The most recent one was crawling files containing a point cloud for a given location. So I began by observing the behaviour in the browser. After exporting the list of HTTP requests involved in loading the application, I noticed there were a lot of requests fetching resources with a common rXXX pattern.
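The post does not say which tool the request list was exported from, so as a hedged sketch, assume the browser's network tab was saved as a HAR file. The file name and the exact regex for the rXXX path segment below are placeholders, not the real service's URL scheme.

```python
# Hedged sketch: group the exported requests by an rXXX-style resource
# pattern. Assumes a HAR export from the browser dev tools; the file name
# "requests.har" and the regex are hypothetical placeholders.
import json
import re
from collections import Counter

with open("requests.har", encoding="utf-8") as fh:
    har = json.load(fh)

pattern = re.compile(r"/r(\d+)")     # hypothetical rXXX path segment
counts = Counter()

for entry in har["log"]["entries"]:
    url = entry["request"]["url"]
    match = pattern.search(url)
    if match:
        counts[match.group(1)] += 1

# Show the most frequently requested resource ids.
for resource_id, hits in counts.most_common(10):
    print(f"r{resource_id}: {hits} request(s)")
```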