Tuesday, 27 May 2008

Signs of Good Design

It is a good sign when something you design solves problems not originally envisaged. This indicates depth and robustness.

Another good sign is when a third party of high standing makes a recommendation.

Both these signs are apparent in the increasing adoption of the Erlang language and libraries for distributed computing problems. Originally designed for telecommunications software, Erlang is now being used by Amazon, Facebook, and others as part of their infrastructure.

Also, Steve Vinoski, an authority on CORBA, an older technology for distributed computing, has been singing the praises of Erlang for largely "getting it right". In response Joe Armstrong (inventor of Erlang) has thanked Steve for going down the wrong path and living to tell the tale. Here's Joe on Steve, and Steve on Joe.

I hope that everyone working in distributed computing can take note of the lessons that Steve (and Joe) learned the hard way, without necessarily repeating all the hard yards.

Remember: It is always a good time to learn from other people's mistakes (and your own).

Thursday, 22 May 2008

Got Art?

Last night my beloved and I splurged on a piece of modern art. In the past we've made our own -- in a "Jackson Pollock" style -- but Andrea spotted this work by local artist Gary Solomon while driving home yesterday:

Go 'Pies!

A screech of brakes, much excitement, a quick discussion of finances, and we're newly minted patrons of the arts. Now we just need more wall-space (and money) to grow our collection.

Thursday, 15 May 2008

Programming Yin and Programming Yang

Q: Why is programming fun?
A: It is a form of play and exploration in which a "one-time" "small" effort (writing the program) yields a large reward (the program does something). It combines the artistic joy of creation with the scientific reward of nutting out a puzzle.

Q: Why is programming hell?
A1: When the program fails to produce the expected results a diagnostic process of "debugging" follows. As the program grows in size and power and (hence) complexity, this debugging phase dominates the programmer's time, and brings mainly relief rather than reward.

A2: When requirements change, the program may need to be re-jigged to accommodate them, and this re-jigging, like debugging, lacks the immediate rewards.

* * *

So, to increase the rewards of programming -- and incidentally productivity and profitability -- techniques are sought to reduce the dross (debugging and re-jigging) and increase the rewards (features and elegance).

On the Yang side of programming I include design, architecture, algorithm invention, and implementation of new features.

On the Yin side I include practices like Design By Contract (DBC), Test Driven Development (TDD), and Continuous Integration (CI). These provide no up-front reward, but pay for themselves over time.

Interestingly, all three involve using programming to write better programs. DBC and TDD may be regarded -- with some licence -- as simple forms of meta-programming. We are writing program fragments to help test our programs. Also, the act of making tacit assumptions explicit helps us to think about and hence improve the underlying design.
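To make this concrete, here is a minimal sketch (in Python, purely illustrative -- the function and its contract are invented for the example) of both practices at once: a Design By Contract style pre/postcondition, and a tiny TDD-style test fragment. In both cases we are writing program fragments whose only job is to check our program.

```python
def normalise(scores):
    """Scale a list of non-negative scores so they sum to 1.0."""
    # DBC: make the tacit assumptions explicit as executable checks.
    assert scores, "precondition: at least one score"
    assert all(s >= 0 for s in scores), "precondition: scores are non-negative"
    total = sum(scores)
    assert total > 0, "precondition: at least one positive score"
    result = [s / total for s in scores]
    # Postcondition: the result really is a probability distribution.
    assert abs(sum(result) - 1.0) < 1e-9, "postcondition: sums to 1.0"
    return result

# TDD: a test written alongside (or before) the function itself.
def test_normalise():
    assert normalise([1, 1, 2]) == [0.25, 0.25, 0.5]

test_normalise()
```

Note that the contract checks pay off twice: they catch bad inputs at the boundary, and writing them forces us to articulate what the function actually promises.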

By putting in place these effective (Yin) measures, we can be bolder (Yang) in adding new functionality, having built a software safety net as we step forward.

True meta-programming -- writing programs that write programs -- brings us back to the Yang side.
We are now truly working at a higher level of abstraction, with correspondingly more leverage, but when bugs appear in our meta-programs the debugging gets harder too ...
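A toy illustration of the idea (hypothetical, and deliberately trivial): a program that generates the source text for a family of power functions, then compiles and runs the result. The leverage is real -- one template yields many functions -- but a bug in the template now shows up in code we never wrote by hand.

```python
# Template for the generated functions; {n} is filled in per function.
TEMPLATE = "def power{n}(x):\n    return x ** {n}\n"

# Generate source for power2, power3, power4 ...
generated_source = "".join(TEMPLATE.format(n=n) for n in range(2, 5))

# Compile the generated text into live code.
namespace = {}
exec(generated_source, namespace)

print(namespace["power3"](2))  # -> 8
```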

Monday, 12 May 2008

Fred Brooks on Online Collaboration

DL Weinreb, one of the co-creators of Common Lisp, took some notes at the OOPSLA 2007 conference. One of the talks was given by Fred Brooks:
Frederick Brooks Jr., author of the classic book “The Mythical Man-Month”, talked about telecollaboration. Most of the talk was about collaboration itself, and under what circumstances it’s a good thing: not always! His main point is that collaboration is great for determining system requirements and brainstorming about possible approaches, but that you really need a single system architect in order to achieve conceptual integrity. The system architect can delegate parts of the architecture to others (e.g. the user interface czar), but he distinguishes sharply between delegating design (OK) and sharing design (not OK).
Readers of The MMM will not be surprised by Brooks's reiteration of the need for a single designer to ensure a unified vision, but it's nice to see a strong position brought to light in a new context.

For me, Brooks's pronouncement raises a couple of questions:
  • To what extent does his advice apply to non-software enterprises?
  • To what extent should collaborative software endeavour to encourage "good practice" through constraining its users?

Monday, 5 May 2008

Algorithms vs Architecture

There are many facets to software design. Some, like user-interface design, are apparent to the end user. Others take place almost entirely under the hood. These are the bones on which the software is built.

Two significant under-the-hood aspects are algorithms and software architecture.

Algorithms form a major strand of computer science. They are concerned with the nitty-gritty of getting the computer to do stuff. The major aspects of an algorithm include: what it does, its domain of valid inputs, and its time- and space-complexity.

Software architecture, on the other hand, is concerned with the high-level structure and conventions of a software system, and is more of a topic for industry than academia. Good architectural choices contribute to the overall performance, resilience, and extensibility of a system.

We need both! And interestingly, you neglect either one at your peril.

For example, we recently noticed a significant performance slow-down that turned out to be an architecture-algorithm interaction. The algorithm itself was O(n^2) in time, but an imperfection in the architecture caused it to be executed as each item was added to the structure, making the overall build effectively O(n^3), and correspondingly sluggish.
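A contrived sketch of the pattern (not the actual code from our incident -- the names and the O(n^2) step are invented for illustration): `summarise` is quadratic on its own, but wiring it in so that it re-runs after every insertion multiplies its cost by the number of insertions.

```python
def summarise(items):
    # Deliberately O(n^2): examine every ordered pair.
    return sum(1 for a in items for b in items if a < b)

def build_slow(values):
    """Architectural slip: re-summarise after every insertion -> O(n^3)."""
    items, summary = [], 0
    for v in values:                  # n insertions ...
        items.append(v)
        summary = summarise(items)    # ... each triggering O(n^2) work
    return summary

def build_fast(values):
    """Fix: run the O(n^2) step once, after the structure is complete."""
    return summarise(list(values))

# Both produce the same answer; only the cost differs.
assert build_slow(range(100)) == build_fast(range(100))
```

The fix was architectural, not algorithmic: the algorithm stayed O(n^2), it just stopped being invoked n times.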

As is so often the case, a problem that is tricky to track down turns out to stem from the interaction of two seemingly separate factors.