Stuck in the foothills
Publication date: 17 October 2011
Originally published 2011 in Atomic: Maximum Power Computing
Last modified 28-Jun-2012.
There may be a higher point somewhere nearby, but you have to go down, at least a little, before you can go up any further. If you program a robot to climb mountains by telling it to always go up, and never down, it will almost certainly never successfully climb a mountain at all. It'll get stuck on a local maximum - a foothill, or minor peak.
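The robot's failure mode is easy to demonstrate with a few lines of greedy hill-climbing code. This is just an illustrative sketch with a made-up terrain, not anything from a real robot: the climber only ever steps to a higher neighbouring position, so it halts on the first summit it finds, foothill or not.

```python
def hill_climb(terrain, start, step=1):
    """Greedy climber: moves to a neighbouring position only if it's higher."""
    x = start
    while True:
        # Look one step left and one step right; pick the higher neighbour.
        best = max([x - step, x + step], key=terrain)
        if terrain(best) <= terrain(x):
            return x  # no uphill neighbour left: stuck, possibly on a mere foothill
        x = best

# A hypothetical terrain: a foothill at x=2 (height 4), a valley,
# then the true peak at x=8 (height 10).
def terrain(x):
    heights = {0: 0, 1: 2, 2: 4, 3: 3, 4: 1, 5: 2, 6: 5, 7: 8, 8: 10, 9: 6, 10: 0}
    return heights.get(x, -1)

print(hill_climb(terrain, start=0))  # stops at 2: the foothill
print(hill_climb(terrain, start=5))  # starts past the valley, so reaches 8
```

Starting from the bottom of the range, the climber parks itself on the foothill forever; only a start that happens to skip the intervening valley ever reaches the real peak. That, in miniature, is the whole problem.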
This concept is applicable to all sorts of things.
Say, for instance, that you're a young man who's decided to learn to play the accordion, to pick up girls. After five years of practice, you're a quite good accordion player, but have made the terrible discovery that your skills are not, in fact, very good at attracting women. You now realise that you should have learned the guitar. But to switch to guitar now would leave you a rank amateur, lost in the crowd of other men at parties with guitars they cannot actually play.
Or let's say you're a clever person in the Bronze Age, and have decided to invent writing. It seems quite sensible to you to just use a little picture for each word. There aren't that many words, after all. Why bother inventing an alphabet so you can make words out of multiple characters? Clay-tablet space is limited, people!
As it turns out, though, writing systems that use different characters for different words and syllables have serious problems. They may be beautiful and evocative and spatially compact (people porting Japanese video games to English have had a lot of fun compressing 20-character, 10-word speech bubbles into 20 English letters...), but they're difficult to learn and remember, and it's very difficult to figure out the meaning of an unknown character, or an unknown multi-character word. And let's just say you probably don't want the job of figuring out how a Chinese dictionary should be arranged. Don't even ask about typewriters.
(See also: Roman numerals, which made even simple arithmetic rather agonising.)
There's nothing intrinsically wrong with the obvious idea of using a little picture of a pig to mean "pig", but turn that into a whole written language and you'll end up with even serious scholars of the language sometimes being unable to remember how to write "to sneeze".
Non-alphabetic writing systems like these are a heck of a local maximum, though. Transitioning to an alphabet-based script may actually be more painful than just persuading your whole country to switch to Esperanto.
The thing that makes local-maximum traps so deadly is that they're often hard to recognise - the metaphorical mountain is often hidden by fog. Local-maximum traps are especially hard to recognise when they're blocking efficient use of, or even the very development of, new technologies. By definition, everything most easily visible from a local maximum is not as good as that maximum.
The local maximum may not be a very nice place, either. From a purely economic point of view, slavery was (and still is...) clearly preferable to letting your slaves go and hiring free labourers, at least for the first few years of the transition. Hence the popularity of Slavery Lite™ systems like debt bondage, indentured servitude, exploitative sharecropping and truck systems after frank slavery was stamped out. (For some kinds of business it's arguable that well-run slavery is actually not just a local maximum but a truly good strategy, economically speaking.)
Or take the Space Shuttle. That fragile orbiter was strapped to the side of a monstrous fuel tank instead of sitting on top of it, purely so that the main engines, on the back of the orbiter instead of the far more logical base of that same fuel tank, could push it into space and then never be fired again for the rest of the mission. If the orbiter had been on top of the fuel tank and the main engines on the bottom of it - you know, where every other rocket has 'em - then the Challenger explosion would probably have been survivable, and the Columbia disaster would never have happened at all. The Shuttle never came anywhere near meeting its original design goals, but once everybody was committed to it, nobody wanted to scrap it and lose all of the money they'd already spent. So that's where US manned spaceflight stayed, for thirty years.
Military history is a long, long series of local maxima. Even if all you look at is the last hundred-odd years, you've got unsporting camouflaged troops murdering formation-marchers in natty uniforms, then khaki-clad infantry waves losing to machine guns, then battleships losing to aircraft carriers, then just about every kind of regular army losing to hit-and-fade guerrillas. In every case, the losers have been built into a force well-suited to face whatever they last fought, and changing is so much trouble that it often doesn't happen at all until a war is lost.
Computing is a young field - as late as World War II, a "computer" was still a person who computed - but it definitely has its own local-maximum problems. Big companies that're still using COBOL mega-programs from 1970 - or, much worse, Access mega-programs from 2000. Special hardware that still requires a parallel port for data transfer. Aunties and uncles who refuse to upgrade from Win98.
I'm more interested in those elusive fogged-in local maxima, though.
How about computer games that have cutscenes? Movies aren't punctuated by 30-second chunks of lousy game; I suspect the opposite situation, in which games are punctuated by often-rather-lengthy chunks of lousy story, is only so common because other ways of telling the story require a bit of a downhill walk. (Hideo Kojima is the undisputed king of this particular foothill.)
The computer-games industry has already managed to trudge down from some local maxima. The incomprehensible moon-logic adventure game died a well-deserved death in the 90s, saving the world from using duck on toothpaste to make cappuccino to kill robot. Games that made up for limited content with brutal difficulty also faded from the scene; the trip down that foothill was made a lot easier by beefier hardware and vastly greater storage, especially in home computers and consoles, which don't need to suck coins out of the player. We're also seeing a substantial movement, especially in the indie scene, away from graphics-fetishism and towards more interesting gameplay.
And then there's "professional" computer software that you're expected to spend some time learning, which is nonetheless convinced that every interface must be point-and-click, without a command-line anywhere.
I'm also haunted by the idea that today's faddish, headachey 3D computer monitors could be amazing wearable teleporters to cyberspace if only we'd worked harder on them earlier. Or perhaps the windows-icons-menu-pointer interface, dating back to the Xerox Alto in 1973, is only a tenth as usable as some other system a forgotten genius implemented on a Commodore 128 in 1986.
I bet I'm missing several much bigger painful changes that're just waiting to be made, though.
Climbing a mountain isn't easy, but it's impossible if you haven't found it.