Notes on codes, projects and everything

Working Environment Setup, for 2020

In recent years, I have started to decouple my development environment from the tools delivered by the operating system's package manager. Those tools (compilers, interpreters, libraries, etc.) are usually best left unmodified so that other system packages relying on them keep working as intended. Another reason for the setup is that I want to follow the latest releases as much as possible, which cannot be done unless I enroll myself in a rolling-release distro.

The setup should also be easily replicable, so I can have a similar setup in my WSL2 Ubuntu instance besides my main workstation running Ubuntu (yes, I install all my development tools into the WSL2 instance when I work on my Surface Book 2).

My current editor is Visual Studio Code, but I do use neovim at times too (managed by my own “plugin manager”; I am thinking of migrating to volt when it is ready for neovim).


Python

Starting with Python, which is my primary language for work. The setup here is going to be similar to the other languages shown later.

In order to use a specific release of CPython (or other implementations such as pypy, stackless, etc.), I use pyenv to fetch and build it. Similar tools exist for other languages, and they will be detailed in their respective sections.
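As a sketch, fetching a release and pinning it for a project looks roughly like this (the version number is just an example):

```shell
# fetch and build a specific CPython release, then pin it for the current project
$ pyenv install 3.9.1
$ pyenv local 3.9.1   # writes .python-version in the project directory
```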

New projects are to be managed by poetry. I started with my own hack that somehow wrapped pip and the call to cpython with a shell script. Then I was introduced to pipenv, which performs a similar job (but much more elegantly), and eventually moved on to poetry as it was gaining more support.
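Bootstrapping a new project with poetry goes roughly as follows (the project name is hypothetical):

```shell
# scaffold a project skeleton, then create its virtualenv with dependencies
$ poetry new myproject   # hypothetical project name
$ cd myproject
$ poetry install
```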

Both pyenv and poetry require new PATH entries and some initialization code to be placed in .profile (or .bashrc, depending on how your system is set up).
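The relevant lines look roughly like this (the exact paths are assumptions and depend on where the installers put things):

```shell
# assumed install locations; adjust if your installers chose different paths
export PYENV_ROOT="$HOME/.pyenv"
export PATH="$PYENV_ROOT/bin:$HOME/.local/bin:$PATH"
eval "$(pyenv init -)"   # enables pyenv shims and shell integration
```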

In my editor, I mainly rely on the official plugins by the vscode team, with Pylance bundled as the language server. In addition to the bundled language server, I rely on black, pytest and prospector for formatting, feature tests and linting (isort is pulled in as a dependency when adding those packages, and it is useful for sorting imports).

The development packages mentioned are installed into every single project with the following command (the --dev flag ensures that these dependencies are not pulled in when end users fetch and build the project).

$ poetry add --dev black pytest prospector

Additional information about the setup can be found here, which is much more extensive than what I have here.


Rust

Similarly, there are two main components in my Rust setup. rustup is the officially recommended tool to manage both the compiler and the package manager, so I am using that too.

One downside of this setup is that, in normal usage, I cannot define a different compiler version for each project. However, considering I don't compile two different projects requiring different compiler versions at the same time, it still works fine.

Project management, on the other hand, is done by cargo, which is installed alongside the compiler when rustup is used. As for other development tools, unit test support is built in, so running tests is just a cargo test away. Unlike in Python, however, the linter (clippy) and formatter (rustfmt) are installed via the rustup utility too, so there is no need to install them separately for each project.
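The day-to-day commands, as a sketch:

```shell
# clippy and rustfmt are installed once per toolchain, not per project
$ rustup component add clippy rustfmt
$ cargo test     # built-in unit test runner
$ cargo clippy   # lint
$ cargo fmt      # format the whole crate
```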

The plugin used in the editor is rust-analyzer by Aleksey Kladov. The vscode version of the plugin pulls in the language server as well, so there is no need to worry about installing it manually.


Golang

I don’t do much Golang development, so I have settled with Golang’s snap package and gofmt for now. Unlike the other languages, the go command acts as both a compiler and a package manager.
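A minimal sketch of that dual role (the module path and dependency are hypothetical):

```shell
# the go command handles modules, dependencies, builds and tests alike
$ go mod init example.com/hello       # hypothetical module path
$ go get github.com/google/uuid       # hypothetical dependency
$ go build && go test ./...
$ gofmt -w .                          # format sources in place
```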


PHP

Tooling in PHP is not as great compared to contemporary/modern languages like Golang and Rust. Even when we compare PHP to Python, the tooling feels relatively inferior.

Similar to my setup for Python, I have one tool to manage PHP versions and another for package management. The tool to manage and build the PHP interpreter is phpenv, and the tool that performs package management for projects is composer.

The last project I worked on in PHP was a campaign management system running on Laravel, so feature testing is provided by the framework itself. The linter and formatter, on the other hand, are provided by PHP_CodeSniffer, and I follow the PSR-12 coding standard through the use of the tool.
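A sketch of how that looks per project (the src/ path is an assumption):

```shell
# install PHP_CodeSniffer as a dev dependency, then lint and auto-fix to PSR-12
$ composer require --dev squizlabs/php_codesniffer
$ vendor/bin/phpcs --standard=PSR12 src/    # report violations
$ vendor/bin/phpcbf --standard=PSR12 src/   # fix what can be fixed automatically
```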

On the editor side, I am currently using Intelephense. I don’t currently do enough PHP projects to consider paying for the premium features though. However, I am quite surprised to see that there is no active and focused community effort in this area, unlike in Rust/Golang. This is the part where I said PHP feels relatively inferior.

JavaScript / ES6 / Node.js

Almost all my work with JavaScript deals with frontend work, so no Deno for me, though I am rather intrigued by it.

I have Node.js installed through nodenv, which performs similarly to pyenv, phpenv and friends. Instead of npm, Yarn is used as the package manager.
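A sketch of the initial setup (the version number is just an example):

```shell
# install a Node.js release with nodenv, then yarn globally via npm
$ nodenv install 14.15.4   # example version; needs the node-build plugin
$ nodenv local 14.15.4
$ npm install -g yarn
```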

Unfortunately, in order to use the newer version of Yarn, it has to be enabled manually for every project. So after installing yarn globally through npm, this command is used in each project to be managed by Yarn to switch the package manager to the newer version.

$ yarn set version berry

As I am not too obsessed with tools (as long as they work), and given I don’t do that much frontend work, I have not dug into why the setup has to be done this way.

Formatting, on the other hand, is provided by prettier, which should be bundled with the editor plugin.


Ruby

I briefly learned Ruby at the end of last year, through solving Advent of Code puzzles. So yes, I know the language, but I am not too interested in it (though it is a very sweet language, full of syntactic sugar, and generally feels shiny).

The official runtime is installed through rbenv; like the other friends, it allows each project to define a specific version. Bundler, on the other hand, is the package manager.
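The per-project pinning looks roughly like this (the version is just an example):

```shell
# pin a Ruby release for the project and install its gems
$ rbenv install 2.7.2
$ rbenv local 2.7.2    # writes .ruby-version in the project directory
$ bundle install
```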

Unit testing is done through the minitest gem, and invoked by

$ bundle exec ruby -I test:lib test/test_calculator.rb

As the command is rather long, I usually place it in a Makefile (I once found people expressing their disgust over my use of Makefiles to perform tasks in sequence, which I find ridiculous lol).

I didn’t really need a formatter or a linter, as I was just learning the language out of boredom while solving the Advent of Code puzzles.


Haskell

While I found Ruby’s tooling unnecessarily clunky last year, Haskell is much worse in comparison. The official guide now endorses the use of ghcup to manage compiler versions, which is what I am using too. However, almost half of the population (definitely not statistically correct, just an exaggeration) uses stack for this purpose instead.

From what I read, stack works like rustup + cargo: it manages both the compiler version and the dependency packages for every project. The ghcup utility, the recommended tool, works more like rustup: it manages the compiler and some tools (like the language server, but that implementation is problematic).

My setup uses ghcup to manage the ghc compiler, while package management is done through cabal-install. Speaking of which, this reminds me of how bad the naming is. The library that powers both cabal-install and stack for package management is, coincidentally (or not), also called Cabal. Good luck finding the proper documentation on how to use the command-line utility provided by cabal-install, which is unfortunately also called cabal.
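The ghcup part of the setup is straightforward enough; a sketch (the version is just an example):

```shell
# install and activate a ghc release, plus the cabal-install tool
$ ghcup install ghc 8.8.4
$ ghcup set ghc 8.8.4    # make this the default ghc on PATH
$ ghcup install cabal
```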

All the self-doubt in setting up the development environment @.@

Anyway, to deal with the cabal/stack conflict, especially when I want to build Haskell projects or work on exercism puzzles, I first convert the project from stack to cabal through the stack2cabal utility. However, for some reason, compiling the result with different versions of the compiler yields different outcomes: some fail, some succeed, and I have no idea why.

So if you are like me, after converting a project, open up cabal.project and remember to update this line

with-compiler: ghc-8.8.4

so it matches the version of the compiler you are currently using.

Now, on writing code: DO NOT EVER attempt to install the language server through the ghcup utility if you value your sanity. Fortunately for vscode users, the extension downloads the matching language server itself. The reason for the warning is that every time you switch compiler versions, you have to uninstall and reinstall the language server, otherwise things fail. Even when staying on the same ghc version, the language server will randomly fail when you need it the most.

Feel free to prove me wrong, but I will stick to the version downloaded by the extension.

The language server is good at linting and providing recommendations (if you watch my livestreams solving exercism puzzles, you will see me relying on it a lot). Unit testing is done through hspec (defined by the package; I have not started a project from scratch myself so far), and for it to print fancy colourful output, just do this

$ cabal test --test-show-details=always --test-options=--color

There should be a way to set this as the default, but this works for now.

Formatting, on the other hand, is provided by Ormolu. It mostly just works; I don’t remember having to spend too much time getting it working.

Also, 8GB of memory is not enough to compile even a small package, and you may need more storage space to cache the built dependencies.

Have fun setting up the environment for haskell while maintaining your sanity.

What’s next?

Deciding to compile your own runtime/compiler is a hassle, especially the first time; figuring out which system packages to install is a pain to go through. A shortcut for Ubuntu users is to find the -dev package for the runtime/compiler and see what dependencies are required to build it (e.g. php7.4-dev on groovy). Most of the languages share more or less the same set of dependencies, so you should only need to go through this once.
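A couple of commands that help with that discovery, as an assumed workflow (package names vary by release):

```shell
# inspect what a -dev package pulls in on Ubuntu
$ apt-cache depends php7.4-dev
# or install the build dependencies of a packaged runtime directly
# (requires deb-src entries to be enabled in sources.list)
$ sudo apt build-dep python3
```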

Besides not relying on the system package manager to provide the tools, the setup also allows work to be delivered as-is or through a docker container. It makes it easier for users to build the project in their own environment: as long as they follow the same setup, there should theoretically be no problems.

I recently also found asdf, which looks like a one-stop shop for all of the above. I am rather happy with my current setup, though, and will probably not consider using it for now.
