Less go, more show
Originally published 2004 in Atomic: Maximum Power Computing. Last modified 03-Dec-2011.
It's against my basic nature, but I'm quite happy about the widely reported Depressing Current Trends in the performance computing market.
Like, for instance, the home video game market moving away from PCs, with their zillion and three quirky hardware configurations and substantial piracy problem, to nice standardised less-often-warezed consoles.
And, in a related development, PC gamers stepping off the upgrade path. Or, at least, standing still on it for a while.
Games are still driving many PC enthusiasts to get the shiniest new processor and video card and motherboard with UV-reactive trimmings; there's been a spasm of upgrading now that Doom 3's out (not that you have to have the latest and greatest hardware to run it; ATI and Nvidia must be cursing the scalability of the Doom 3 engine), and the ever-present threat of Half-Life 2 actually being released some time soon is spurring people into questionably sensible preparatory upgrades. Because, of course, the world will unquestionably end if they have to play their new game at only 25 frames per second for a day and a half.
But more and more of us are asking whether there's really any extra fun to be had just because we can now turn on all of the DirectX 9 stuff, and run at 1600 by 1200 with 8X FSAA.
(Particularly when we then get our asses whupped online by a 14 year old whose every waking out-of-school hour is spent playing our game of choice, in 800 by 600 jaggy-vision on his 750MHz Duron box. He's cheating, of course. Sure he is.)
The world's now not exactly short of people who've owned a high end GeForce FX or Radeon for the thick end of two years now, and who re-brained their perfectly good 1.6GHz Athlon box into a New Hotness overclocked P4 2.4C well over a year ago, and can't now quite remember why.
Like me, for instance.
Oh, sure, I gots da mad framez pa' second yo, and I can unzip 3.1GB of Web logs from a 238MB archive in a minute and a half. Which is about, um, eleventy kanillion times faster than the 40MHz Amiga I had when George H. W. Bush was POTUS.
But, y'know, woo-hoo to that. It's not as if the new machine understands what I say and replies in Majel Barrett's voice.
All of the personal computer tech companies whose stock market credibility hangs on double-digit sales growth every year are, right now, even more enthusiastic than usual about finding new things for ultra-fast PCs to do, so they can say something other than "Just Because" in their Why You Should Upgrade advertisements. AMD's at least got some pretty compelling basic performance figures behind their AMD64 products; Intel's really scratching to find reasons why people should care about LGA 775 and BTX, right now.
Whether there'll be a genuinely new MFLOPS-munching craze in the near future, beyond incrementally more demanding games and incrementally more bloated versions of Windows, is open to question. Only so many people are interested in home video editing, and I've been hanging out for consumer-market virtual reality gear since I was using the abovementioned Amiga.
In the meantime, though, the slackening of must-go-faster mania means computer-gear developers ought to have some more time to concentrate on user interfaces. This, people, is a very good thing.
I don't know about you, but I spend a lot of time sitting in front of this screen. A lot of time. All that time has blinded me to some extent to the myriad awfulnesses of Windows, and WinXP is certainly a great big improvement over Win95 (or, if you want to be really perverse, Win3.0...), but it ain't exactly news that Microsoft has a way to go yet. Apple are further ahead, but not that much further.
When everybody's talking processor power and screen resolution and number of Robert Ludlum novels per second, user interfaces invariably fall by the wayside. Good interfaces are hard. A decent user interface can take more development time than the hardware did, and you'd better not let the hardware engineers design it.
Interfaces can even force changes on the hardware, which is something that keeps project coordinators awake at night. There are bad things about doing interface work before, after or during hardware development.
Many companies therefore find that there's a lot to be said for, well, not doing interface work.
Make widget that can do many things. Provide some way for widget owner to direct widget to do these things, provided said widget owner has read 243-page Quick Start Guide and has a lot of spare time to navigate 28-line menus on a 3-line display. List widget's functions on brochure. Package widgets for sale. Remember to include bottle of Scotch with widgets being sent to reviewers.
(Seriously, guys. Remember.)
When an industry's customers and marketing departments are slavering along in a positive-feedback loop of specification-mania, as the PC industry usually is, the above process results in products with gigantic feature lists and incomprehensible user interfaces.
Don't even start me on mobile phones. Just look at MP3 players.
There are a few MP3 players that aren't much harder to use than either flavour of iPod, but a fair slice of the We're Not Worthy-ing aimed at the iPods is justified. Glory, hallelujah; someone knuckled down and made an MP3 player without a freakin' monkey-puzzle interface. (OK, there's a calendar and games in there for some damn reason, but at least they don't get in the way.)
The iPod hardware is cool, but it's nowhere near the whole story, even if you don't care about iTunes. Apple took until late 2001 to get the first iPod to market, and the reason for that was the interface, and the hardware/interface integration.
There'd been plenty of hard-drive digital music players before then, and iPods have never offered the most storage per dollar. If you're not a complete fashion victim, there's now a good case to be made for several other similarly-specced players - but that's mainly because the iPod spurred other manufacturers into making their products more usable.
It takes a dedicated trail-blazing company like Apple (or, uh, Pixo) to focus on usability when everyone else is just coming up with more spec-sheet boxes in which to put ticks.
But when consumers are scratching their heads and wondering exactly what it is they might want to do that a $1500 chain-store computer can't handle, everyone else in the business ought to start thinking harder about usability.
There'll be another craze along soon enough to give us all more of a reason for our muscle-car PCs. But in the meantime, I'm hoping for some products with more points assigned to Charisma than Strength.