Odi et Amo: Programming Edition
One of the things about being a programmer is that you use a lot of different tools, where “tool” is roughly defined as a discrete program that accomplishes some particular task. IDEs are tools; the command line is many tools accessed through a common interface; programming languages themselves are tools, typically surrounded by a larger ecosystem of other tools that are necessary to get things done; and so on. Most of these tools can be combined with one another, so a given user’s toolchain can, in theory, be any one of an exponentially exploding number of tool combinations.
The tragedy of the situation is how bad so many of these tools are.
There’s an attitude which I think is more prevalent in the software development world than in other technical disciplines: one that licenses half-assed work. There are a lot of contributing factors to this attitude. For example, the fact that most software isn’t running critical operations means that bugs are, relatively speaking, low priority. It’s just not that important to get it right the first time, unlike, say, building a bridge or engineering a car. Combine that with the valorization of hacking and you get a recipe for lots of shoddy software. Furthermore, there’s a legacy element to software that makes it hard to correct old mistakes or fix bad old tools; unlike physical tools, which wear down and have to be replaced, code is basically forever. So if a bad tool makes its way into the chain, e.g. by having been built before the alternatives, it’s near-impossible to get rid of it.
The other attitudinal problem that the software development community has is a predisposition towards “tough it out” thinking. This manifests as a tendency to disregard factors like user experience and design quality and to shift the blame for any problems with a tool onto its user. These attitudes usually don’t appear (or are suppressed) in environments where user experience actually matters, i.e. where money changes hands, but those are situations in which software developers are selling their product to non-developers or the general public, rather than creating tools for each other. When it comes to the tools we actually use ourselves, the conventional wisdom is that it’s the user’s responsibility to adapt to the tool, rather than the opposite. The problem is exacerbated by the fact that tools which became entrenched through early adoption, in the days when almost no tools existed, have only marginal incentive to improve.
As a result of these factors, we’re left using a lot of extremely shitty tools to perform development tasks. The various shells that originated with Unix and have been passed down to its spiritual descendants are uniformly terrible. Shell scripts have a barely-comprehensible syntax while providing a fraction of the power of real programming languages. And yet these pieces of legacy code are used to maintain entire operating systems. That seems clearly absurd, but in a world where no scripting languages existed, a shitty scripting language was better than none.
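To make “barely-comprehensible” concrete, here is a small sketch of classic Bourne-shell pitfalls; the filenames and variables are made up for illustration:

    # Assignments can't have spaces around '='; quoting silently changes semantics.
    files="report a.txt report b.txt"   # two filenames containing spaces: already unrepresentable
    for f in $files; do                 # unquoted expansion word-splits into four items
        echo "$f"
    done

    name=""
    if [ $name = "foo" ]; then          # with $name empty, this dies at runtime with a cryptic error
        echo match
    fi
    if [ "$name" = "foo" ]; then        # quoting fixes it; the difference is nearly invisible
        echo match
    fi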
Of course, when real scripting languages appeared they only improved the situation marginally. Perl was still shitty, but it allowed you to break so many more things and it had regexes as first-class objects for some reason so you didn’t have to pipe your text to awk (and learn yet another syntax), so people jumped all over it. Ignoring the fact that it initially had almost none of the useful abstractions of more advanced programming languages like Common Lisp; ignoring that its syntax was cobbled together from an unholy mixture of C, awk, and shell; ignoring that Larry Wall, for reasons that can only be assumed to be sadistic, designed it to mimic natural language; the internet went bonkers over it and adopted it wholesale. And now, even though it’s slowly dying out, every once in a while a programmer logs into some old system and discovers some legacy Perl scripts whose author is long gone and which are utterly incomprehensible. The old saw about Perl being the glue of the internet is apt: it’s a nasty, tacky substance which gets into small niches from which it is impossible to extract. Perl is built on the philosophy that there’s more than one way to do it and you should never be prevented from picking the wrong way. Perl sucks.
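A hedged illustration of both points, with made-up strings: regexes really are language-level values in Perl, and the referencing rules give you several interchangeable spellings for the same access:

    use strict;
    use warnings;

    # Regexes are built into the language; no library, no piping to awk.
    my $text = "error: disk full";
    my $re   = qr/^error: (.+)$/;       # qr// yields a first-class regex value
    print "got: $1\n" if $text =~ $re;  # match and capture in one expression

    # References: at least three ways to reach the same element.
    my @list = (1, 2, 3);
    my $ref  = \@list;
    print ${$ref}[0];                   # explicit dereference
    print $$ref[1];                     # sigil-doubling shorthand
    print $ref->[2];                    # arrow notation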
Autotools sucks. Autotools started out as a script that some guy wrote to manage his Makefiles, which also suck. Autotools is conceptually a correct idea, namely, that you shouldn’t have to write per-configuration Makefiles by hand, but it goes about it in a bizarre way. Take a look at the truly byzantine dependency graph on the Autotools Wikipedia page. There are a ton of moving parts, each one with its own configuration logic. Naturally all of this is written in shell script instead of a real language, so Cthulhu help you if you ever have to get down into the weeds of Autotools. Most people run it like a ritual invocation: you do the minimal amount necessary to get your project to build and hope nothing ever breaks. Actually, Autotools is written in at least two languages, because it uses the m4 macro processor, which Kernighan and Ritchie wrote for C back in the Paleolithic. Hmm, what other language do I know of that is useful for writing domain-specific languages because of its advanced macro capabilities? But of course using Common Lisp would have been too obvious, so m4, which, again, has a totally incomprehensible syntax and no real programming-language functionality to speak of, is what gets used. Thanks to a lot of talented people wasting a substantial portion of their lives, Autotools has been brought to a level where people can actually use it, which means this nightmarish rats’ nest of code is now irreplaceable basically forever. Autotools is the abyss.
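The ritual invocation, roughly; the exact incantation varies by project and Autotools version:

    autoreconf --install   # chains aclocal, autoconf, automake, etc. to expand the m4 macros
    ./configure            # the generated portable shell script, often thousands of lines long
    make && make install   # finally hand off to Make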
PHP is a fractal of bad design. Created by someone who wasn’t really interested in programming for people who also apparently aren’t interested in programming, or consistency, or reliable behavior, or really any other normal markings of a functional piece of software, PHP caught on like consumption at a gathering of 19th century Romantics, because it allowed you to make terrible web pages, which is what everyone in the 90s wanted to do. However many revisions later, people are still phasing out shitty old features of the original language in the hopes of someday creating something one-third as pleasant to use as Python.
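A few of the classic quirks, hedged because the semantics have shifted across versions (PHP 8 finally changed some of these):

    <?php
    // Loose comparison coerces types in surprising ways (PHP 5/7 behavior).
    var_dump(0 == "a");       // bool(true) before PHP 8: the string coerces to 0
    var_dump("1" == "01");    // bool(true): numeric strings compare as numbers
    var_dump("10" == "1e1");  // bool(true): scientific notation counts too

    // The falsy-return trap: strpos() returns 0 for a match at index 0.
    if (strpos("abc", "a")) { echo "found"; }  // never prints; the test needs !== false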
Make is terrible and confusing. Autotools was created so that you wouldn’t have to write Makefiles by hand, which tells you something about what a pleasant experience that was. Make was initially a purely rule-based system, but at some point it dawned on folks that perhaps they’d like to have things like “iteration” and “conditionals” in their build process, so naturally those got grafted onto Make during, I assume, some sort of Witches’ Sabbath, with Satan’s presence consecrating the unholy union (a fragment at the end of this piece shows the resulting mixture). Despite being created contemporaneously with many of the tools mentioned above, Make does not share syntax with them. In order to avoid writing Makefiles, a complicated tool called cmake was invented, which allows you to write the files that write the Makefiles in yet another syntax whose comprehensibility falls somewhere between Make itself and a shell script. As per Greenspun’s 10th Rule, Make almost certainly contains at least a portion of a working Common Lisp interpreter. Make sucks.

All these terrible and weird legacy pieces of code have survived down the generations from early times when they were nothing more than convenient hacks that made it possible to automate things. Over the years, they’ve accreted corrections and version numbers and functionality, and eventually either the process of using them was made somewhat tolerable or most users were insulated from the messy core by layers and layers of supporting infrastructure. Because replacing old stuff is hard, and because code doesn’t wear out the way that hardware does (and also because most of the cost of poor usability falls on the developers themselves), these tools just persist forever. Any discussion of their terrible usability or their shortcomings is met with, at best, indifferent shrugs (“It’s too bad, but who’s going to take on that job?”) or outright hostility. People become habituated to their tools and view any suggestion that they might be inadequate as a personal attack. Just check out that PHP post, in which a bunch of people in the comments defend PHP on the grounds that “it’s weird but we’ve gotten used to it!” Well, you can get used to driving a car with faulty alignment or driving nails with a microscope, but that doesn’t mean you should. If you bring up Perl’s syntax or its weird referencing rules, you’ll be told that you should just memorize these things, and that once you do it’s not that big of a deal. Suggestions that perhaps knowledge of modern programming practices should be put to good use by creating replacements for tools that behave in opaque and hard-to-understand ways are greeted with incredulity at the heresy.

As developers, I think we do ourselves no favors this way. We should demand, and work to build, better tools. We should have build systems that can be configured in a language that’s easy to parse and understand. We should make use of the strengths of the languages we do have, so that when we need a macro expander, we have one in Lisp or one of its variants. We should have languages that don’t confuse us with unnecessary visual clutter and which are easy to read. We should not be afraid of abandoning old tools just because they’re old and were created by esteemed personages at the dawn of programming. We should, above all, pay lots of attention to human factors and usability studies, because human time is precious but computer time is cheap. We should, in the end, not be afraid of change, of learning from past mistakes, and of abandoning rather than perpetuating legacy code.
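For the record, here is the Make fragment promised above, a hedged sketch of the mixture in question; the targets and flags are invented for illustration:

    # Macro assignments come in their own dialects (=, :=, ?=, +=).
    CC   ?= gcc
    SRCS := $(wildcard *.c)    # "functions" use yet another $(...) syntax
    OBJS := $(SRCS:.c=.o)      # substitution references, a third notation

    ifeq ($(DEBUG),1)          # the grafted-on conditionals
    CFLAGS += -g
    endif

    %.o: %.c                   # the original rule-based core
    	$(CC) $(CFLAGS) -c $< -o $@   # recipe lines must begin with a literal tab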
That’s my presidential platform; write me in next November.