Last day with Haskell, and this time we grapple with classes and monads. That’s pretty much where most beginners give up in disgust…
And that’s too bad, because both (along with the rest of Haskell’s advanced features) are very expressive (see the Maze problem below) and powerful. But classes are unusual (and the name tends to confuse object-oriented programmers), while monads appear to solve a problem that is trivial in other languages (which is not true, or mostly not true).
Classes are more properly understood as interfaces, though even that comparison can be misleading. The members of a class are types that are guaranteed to provide implementations of specific functions. Combined with polymorphism, this allows writing functions that are more intimate with their arguments while keeping purity and referential transparency (in some sense, classes play a role similar to functors in ML languages).
Classes nicely support monads (and other composition mechanisms) by providing interfaces these abstractions can build upon.
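As a small illustration (my own example, not from the original post), a class declares functions that every instance type must implement; the compiler then picks the implementation from the type of the argument:

```haskell
-- Hypothetical example: a class is a contract that instance types must fulfil.
class Describable a where
  describe :: a -> String

data Light = Red | Green

instance Describable Light where
  describe Red   = "stop"
  describe Green = "go"

-- A polymorphic function that works for any Describable instance.
announce :: Describable a => a -> String
announce x = "Signal says: " ++ describe x
```

`announce` knows nothing about `Light` specifically; it only relies on the `Describable` contract, which is exactly the kind of interface monads and other abstractions build upon.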
Monads are mechanisms to compose calculations. They also happen to solve the IO problem in Haskell, which is a nice (and significant) bonus, but the composition of calculations is the core idea. With it, one can have backtracking, continuations, probabilistic computing, anything you could think of. Actually thinking of anything is quite hard, as most programmers are not used to such freedom. For IO, the monad just happens to guarantee sequential evaluation.
There are a number of resources for learning about monads (indeed, it seems the path to Haskell mastery must include writing at least one monad tutorial). I found All About Monads very useful. Its introduction is really good, but links to this tutorial tend to disappear, unfortunately. There is also the Monads chapter of Real World Haskell (which you should read, but wait for the second edition before buying). That introduction is more complex, as it builds a State monad rather than the Maybe monad.
Finally, A Fistful of Monads from Learn You a Haskell for Great Good also covers the Maybe monad as an introduction. I have the book but have not read it yet, so I cannot comment on it, though I have seen great reviews.
Lookup function returning Maybe
my_lookup is easy; it iterates over a list of (key, value) pairs and returns Just value when the key matches. On an empty list, it returns Nothing. There is no need to think about monads at this point: a key/value map is really just a data structure that Maybe contains a specific key.
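The listing itself did not survive the page formatting; a minimal sketch matching that description (the name my_lookup is from the post, the implementation is mine) could be:

```haskell
-- Sketch of my_lookup: walk the (key, value) pairs, return Just the first
-- match, and Nothing when the list is exhausted.
my_lookup :: Eq k => k -> [(k, v)] -> Maybe v
my_lookup _ [] = Nothing
my_lookup key ((k, v):rest)
  | key == k  = Just v
  | otherwise = my_lookup key rest
```

This is essentially Prelude’s own lookup, rewritten by hand as an exercise.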
Slightly more difficult was testData; the nesting was somewhat tricky to get right.
Once this is defined, using it with the >>= operator is really simple.
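The original listings are not recoverable from the page, but here is a hedged sketch of the idea: with nested association lists (the shape of testData below is my assumption, only the name comes from the post), >>= chains the lookups, and any Nothing along the way short-circuits the whole chain. Prelude’s lookup stands in for my_lookup:

```haskell
-- Hypothetical nested data; the post's actual testData is not shown.
testData :: [(String, [(String, Int)])]
testData =
  [ ("alice", [("age", 30), ("score", 7)])
  , ("bob",   [("age", 25)])
  ]

-- Chained lookups in the Maybe monad: a missing key anywhere yields Nothing.
ageOf :: String -> Maybe Int
ageOf name = lookup name testData >>= lookup "age"
```

There is no explicit check for failure at the intermediate step; the Maybe monad’s >>= does it for us.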
Using the List Monad to solve problems is very similar to using Prolog: elements in a list are alternative paths; failure (which must be explicit, by calling guard) backtracks to the next alternative; return adds a solution to the list of solutions (even if there is only one possible solution, the List Monad produces a list).
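As a tiny illustration of that Prolog flavour (my example, not the post’s): each draw from a list opens alternatives, guard prunes the failing ones, and return emits a solution.

```haskell
import Control.Monad (guard)

-- All pairs (x, y) with x <= y summing to n: each list draw is an alternative
-- path, and guard is the explicit failure that triggers backtracking.
pairsSummingTo :: Int -> [(Int, Int)]
pairsSummingTo n = do
  x <- [1 .. n]
  y <- [x .. n]
  guard (x + y == n)
  return (x, y)
```

For example, pairsSummingTo 5 produces [(1,4),(2,3)]: every (x, y) that fails the guard silently backtracks to the next alternative.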
To solve the maze, the algorithm does the following:
- the loop subfunction explores a given candidate solution; the List Monad hides the iteration over alternatives and the backtracking
- loop is always called with the path so far in reverse order: the first element is the current position
- if the current position is the exit position, the path is reversed and returned as a solution
- otherwise, the current node is checked and its exits retrieved
- the positions already in the path are removed from the exits, to avoid looping (so we never go over the same position twice)
- the List Monad logic proper starts there:
- guard that the list of possible exits is not empty
- then select an alternative new position
- call loop on the new path to explore it
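The actual listing did not survive the formatting; the steps above might be sketched like this (the Maze representation and the names solveMaze and Pos are my assumptions, not the post’s code):

```haskell
import Control.Monad (guard)

type Pos  = (Int, Int)
-- Hypothetical representation: each position maps to its reachable exits.
type Maze = [(Pos, [Pos])]

-- 'loop' carries the path in reverse (its head is the current position);
-- the List Monad supplies the iteration over alternatives and the backtracking.
solveMaze :: Maze -> Pos -> Pos -> [[Pos]]
solveMaze maze start exit = loop [start]
  where
    loop [] = []
    loop path@(pos:_)
      | pos == exit = return (reverse path)          -- a solution: reverse and emit
      | otherwise   = do
          let exits = maybe [] id (lookup pos maze)
              nexts = filter (`notElem` path) exits  -- never revisit a position
          guard (not (null nexts))                   -- dead end: backtrack
          next <- nexts                              -- pick an alternative exit
          loop (next : path)                         -- whole path, not its tail!
```

The recursive call passes the whole extended path, which is exactly the point of the bug described below.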
The code is fairly short, and could perhaps be shorter. The backtracking provided for free by the List Monad is very effective for implementing a search (I used it to solve Sudoku problems).
To be fair, I took some time to track down a bug: instead of passing the whole current path to the recursive loop call, I only passed its tail. That caused loop to ignore the current position and run in circles forever. Once fixed, the search was instantaneous.
The rest of the code includes a parser for a specific maze description format (I found sample mazes here), code to turn a problem into a Maze instance, and code to display a solution.
I tested it against sample data (test data here).
Wrapping up Day 3 and Haskell
The exercises (OK, just the maze one) were challenging, and they show, or at least hint at, Haskell’s strength: the ability to compose calculations from basic elements and to add advanced control mechanisms almost transparently.
I enjoy Haskell; it lets you think at a higher level and see computations from different angles, in a way that expands the mind (sometimes painfully). And while it is certainly not mainstream, there are high-quality niches (banking and financial companies) that justify investing (stupid pun intended) in Haskell.