The past semester was a very busy one at my faculty, which is the main reason I failed to update this blog. Now that I have free time again, I’ll present some of the coding assignments I worked on during it.
One of my teachers once told some of his students: «Your job is both beautiful and cursed». He was referring to the fact that, in Computer Science, one always has to learn something new while being certain that it will become obsolete sooner or later.
However, this is only partially true, at least for the programming side of Computer Science.
We know that GHC (The Glorious Haskell Compiler) can output its machine code via assembly or via C. Someone once asked on Haskell-cafe whether LLVM could be used instead, since once you get your code into LLVM it becomes very easy to target several platforms. Though a year ago the answer was discouraging, a BSc thesis by David Terei contradicted it. Right now, we can write very fast code in Haskell using the LLVM backend. If you need a benchmark, see an example here.
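For reference, selecting the LLVM backend is a single compiler flag. A minimal sketch, assuming a GHC version that ships the backend and an LLVM toolchain available on the PATH:

```shell
# -fllvm selects the LLVM code generator instead of the native one;
# -O2 enables optimisations (where the LLVM backend tends to shine).
ghc -O2 -fllvm Main.hs -o main
```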
Now, a short review of the thesis. Starting with a few details on compiler design and quickly delving into GHC, David presents his work on implementing the LLVM backend for GHC. His thesis ends with some short evaluations, but I enjoyed the above example more.
His paper has enough code examples to make everything clear to any beginner in the topic of compilers, and I recommend reading it to anyone wanting to work in this area while it is still open.
To me, reading this paper was like discovering a hidden treasure. Had I read it sooner, I wouldn’t have made several mistakes in my Verilog compiler (the third article about it will follow shortly after this post).
These days I read an ACM Turing Award lecture: John Backus’s «Can Programming Be Liberated from the von Neumann Style?»
Though some perceive this paper as an apology for creating FORTRAN, reading the article reveals several aspects that transcend this opinion. We know that programming languages tend to grow more and more in complexity and constructs while trying to become more powerful (I will not take the NKS road in this article). However, the complexity grows far faster than the power, which is discouraging. Moreover, combining two programs into a more powerful one is not as straightforward as one would expect. This is because programs in conventional languages bear a close resemblance to a machine: they mirror the constructs of a von Neumann architecture.
Starting from this idea, and aiming to make programs composable in a simple way, John Backus presents a different kind of programming: the functional one. Don’t think of Haskell, Scheme or Lisp right now. He even argues that a version of Lisp falls short of this goal. He then creates a simple (not really so simple) programming language in which everything is a function and no named parameters are involved. Read the article for more information.
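The flavour of this function-level style can be sketched in Haskell, which is only an illustration on my part: Backus’s FP had its own syntax and primitives (such as «apply-to-all» and «insert»), which roughly correspond to map and fold below. Note that the program is built purely by composing functions, with no named data parameters:

```haskell
-- A sketch of Backus-style function-level programming, using Haskell
-- as a stand-in. "apply-to-all" becomes map, "insert" becomes a fold,
-- and programs are glued together with function composition (.).

-- Sum of squares, written point-free: no variable ever names the input list.
sumOfSquares :: [Int] -> Int
sumOfSquares = foldr (+) 0 . map (^ 2)

main :: IO ()
main = print (sumOfSquares [1, 2, 3, 4])  -- prints 30
```

The point is that `sumOfSquares` is defined as an algebraic combination of two smaller programs, which is exactly the kind of composition Backus found missing in von Neumann-style languages.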
APL (A Programming Language) embodies a similar function-level style; in fact, Kenneth Iverson created it before Backus’s language, and Backus acknowledged it as an influence. For the most curious of you, here we have a Conway’s Game of Life implementation in one single line of code. Mind-blowing language, isn’t it?
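For comparison, here is what one generation of Life computes, spelled out in Haskell rather than APL. This is my own sketch, representing the board as a set of live cells; the famous one-liner packs the same neighbour-counting rule into a single array expression:

```haskell
import qualified Data.Set as Set

type Cell = (Int, Int)

-- One generation of Conway's Game of Life: a cell is alive in the next
-- step if it has exactly 3 live neighbours, or 2 if it was already alive.
step :: Set.Set Cell -> Set.Set Cell
step live = Set.fromList
  [ c | c <- candidates
      , let n = length (filter (`Set.member` live) (neighbours c))
      , n == 3 || (n == 2 && c `Set.member` live) ]
  where
    -- The 8 surrounding cells.
    neighbours (x, y) = [ (x + dx, y + dy) | dx <- [-1, 0, 1]
                                           , dy <- [-1, 0, 1]
                                           , (dx, dy) /= (0, 0) ]
    -- Only live cells and their neighbours can be alive next step.
    candidates = Set.toList live ++ concatMap neighbours (Set.toList live)

main :: IO ()
main = print (step (Set.fromList [(0, 1), (1, 1), (2, 1)]))  -- a blinker flips
```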
For more information, one can read Conal’s article here. In fact, it was his blog that directed me to Backus’s work: a single article I stumbled upon while looking for Functional Reactive Programming material.
Today, Brent Yorgey announced a new issue of «The Monad Reader». The ten-day wait between the submission deadline and today was worth it: a Google Summer of Code project (Gergely Patai: a new profiler for Haskell); three useful monads (Edward Z. Yang: Logic, Prompt and Failure), of which the Prompt monad will hopefully be used in a personal project before April; a new way to write monads (Heinrich Apfelmus: an operational view of monads); and a nice implementation of STM in pure Haskell (Andrew Coppin).
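The operational idea is simple enough to sketch here. This is my own minimal reconstruction, not the article’s code: a monad is just a list of instructions, and a separate interpreter gives them meaning.

```haskell
{-# LANGUAGE GADTs #-}

-- A minimal sketch of the "operational" view of monads: programs are
-- sequences of instructions, and the monad instance only does plumbing.
data Program instr a where
  Return :: a -> Program instr a
  Then   :: instr a -> (a -> Program instr b) -> Program instr b

instance Functor (Program instr) where
  fmap f m = m >>= return . f

instance Applicative (Program instr) where
  pure = Return
  mf <*> mx = mf >>= \f -> fmap f mx

instance Monad (Program instr) where
  Return a  >>= k = k a
  Then i k' >>= k = Then i (\x -> k' x >>= k)

-- An instruction set for a tiny stack machine (my own example).
data StackI a where
  Push :: Int -> StackI ()
  Pop  :: StackI Int

push :: Int -> Program StackI ()
push n = Then (Push n) Return

pop :: Program StackI Int
pop = Then Pop Return

-- The interpreter is where the instructions get their semantics.
run :: Program StackI a -> [Int] -> (a, [Int])
run (Return a)        st       = (a, st)
run (Then (Push n) k) st       = run (k ()) (n : st)
run (Then Pop k)      (x : xs) = run (k x) xs
run (Then Pop _)      []       = error "pop from empty stack"

example :: Program StackI Int
example = do
  push 1
  push 2
  a <- pop
  b <- pop
  return (a + b)

main :: IO ()
main = print (run example [])  -- prints (3,[])
```

The appeal of this style is that `example` is plain data: the same program can be run by several interpreters (a pure one as here, a logging one, a testing one) without touching its definition.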
I’ll be back shortly.