Master of Software Arts
I came across an interview of Richard Gabriel this morning. From the intro: “Richard Gabriel is a Distinguished Engineer at Sun Microsystems, where he researches the architecture, design, and implementation of very large systems, as well as development techniques for building them.”
In the interview, Mr Gabriel talks about his idea that there should be a degree equivalent to the MFA for programmers, which he calls a Master of Software Arts. He says:
So, because you can program well or poorly, and because most of it is creative (in that we don’t really know what we’re doing when we start out), my view is that we should train developers the way we train creative people like poets and artists. People may say, “Well, that sounds really nuts.” But what do people do when they’re being trained, for example, to get a Master of Fine Arts in poetry? They study great works of poetry. Do we do that in our software engineering disciplines? No. You don’t look at the source code for great pieces of software. Or look at the architecture of great pieces of software. You don’t look at their design. You don’t study the lives of great software designers. So, you don’t study the literature of the thing you’re trying to build.
He goes on to talk about how his process for writing poetry is similar to how he goes about writing code.
All of this is from the composer’s point of view, but I suppose something should be said about the other side of this analogy. Can a poem be thought of as a sort of program? I code a poem, software for brainframes that run any flavor of the English OS, but particularly v.4.2.1. (The 4 refers to the development release: late modern (Beowulf was developed in pb0.1.5, and compiled in 1.1.2); the .x refers to geography: British .1.1, Scottish .1.2, etc.; US .2.1, Canadian .2.2; and so on.)
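To push the whimsy one step further, the version scheme above can be sketched as actual code. Everything here is my own playful invention layered on the post's joke; the field names, the `runs_on` rule, and the region numbering are assumptions, not anything from Gabriel's interview:

```python
# A toy model of the "English OS" version string: release.region.locale,
# e.g. 4.2.1 = late modern, North America, US. All names are fanciful.

from typing import NamedTuple

class EnglishVersion(NamedTuple):
    release: int  # development era: 4 = late modern, 1 = Old English
    region: int   # geography: 1 = British Isles, 2 = North America, ...
    locale: int   # finer grain: 4.2.1 = US, 4.2.2 = Canadian

    @classmethod
    def parse(cls, version: str) -> "EnglishVersion":
        release, region, locale = (int(part) for part in version.split("."))
        return cls(release, region, locale)

    def runs_on(self, brain: "EnglishVersion") -> bool:
        # A poem "compiled" for one release mostly runs on later releases,
        # with growing loss of fidelity (Beowulf on a modern brainframe).
        return brain.release >= self.release

us_english = EnglishVersion.parse("4.2.1")
beowulf = EnglishVersion.parse("1.1.2")   # "compiled in 1.1.2", per the post
print(beowulf.runs_on(us_english))        # True: a modern reader can, roughly, run Beowulf
```

The backward-compatibility check is the joke's punchline in code: older poems still execute on newer brains, just with deprecation warnings.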
I enjoy analogies such as brain = computer, or language = programming, but only as long as the cart is placed on the proper side of the horse. Our brains came along first, and so did language, long before computers and programs were invented. Computers and programs seem analogous to brains and languages because language-using humans equipped with brains invented them. This is why we see ourselves as fecund metaphors for how computers work. Back when steam locomotives were all the rage, for example, train = brain analogies were common, and equally useful.
In the end, virtually any human technology could probably serve as a functional metaphor for cognition, but no analogy would ever be able to map one-to-one onto what it is analogizing. Nor would we want it to; a model that accounts for every element in its original would become uselessly vast. A map at a scale of 1:1, for example, would be irrelevant. Like the old joke: I don’t want everything. After all, where would I put it?