Absolutely accurate predictions
(Actually, your guess is as good as mine, but this is my site.)
Originally published 2005 in Atomic: Maximum Power Computing. Last modified 03-Dec-2011.
I've been writing these columns for nearly four years now, which is quite a while in information technology.
What's going to happen in the next four?
Well, I'm not expecting a huge amount of change in basic hardware capabilities.
When I wrote Ground Zero #1, a 1.33GHz Athlon was the god of all PC CPUs.
Disappointingly, today's three-point-whatever gigahertz P4s and two-point-whatever gigahertz Athlon 64s only manage about four times the performance of a 1.33GHz Athlon, tops, for optimised code, and a factor of three or less for other tasks. To avoid disappointment, expect only twice the performance.
That's worth having, certainly, but not nearly as exciting as the change from 1995 to 1999. Over that period, we stepped up from the Pentium 200 to the Pentium III 800EB; the clock speed factor increase was bigger, and the performance increase was, generally speaking, commensurately bigger.
Dual-core processors are becoming a sensible proposition now; the 2.8 and 3.0GHz Intel duals are quite sensibly priced, and between them and the AMD X2s we should have a reasonable stopgap measure while the chip makers figure out how to ramp up performance in other ways. Big changes in the near future are still, however, very unlikely.
I'm also not expecting a huge improvement in graphics hardware speed. Or, at least, in the impressiveness of what you can do with said hardware.
When I started writing these columns, the GeForce3 was new and exciting. And, like the 1.3GHz Athlon, it still doesn't stink. The current Nvidia and ATI boards are a bunch of fun, and all, but you can play Half-Life 2 and Doom 3 quite well on an old Athlon/GF3 box (or worse). And I'm not alone in preferring Tribes 2 on that machine over Tribes 3 on a brand new god-box.
Lots of people have bought CPU and/or graphics upgrades because of the big FPS releases recently, but there hasn't been that much upgrade pressure, and software will continue to lag hardware capabilities, because game companies don't want to minimum-spec themselves out of the market.
By 2009, though, those of us on the bleeding edge may at least be able to play a proper hordes-of-monsters Doom game using the Doom 3 engine. Aww, yeah.
So never mind the transistor counts. What genuinely new stuff should we be getting over the next few years?
Surface-conduction Electron-emitter Display (SED) computer monitors look as if they'll actually make it to market in the next four years, provided they don't get buried in litigation.
I wrote about SED three columns ago. It's already dribbling into the grey zone between prototype and real stupidly-expensive early-adopter product, and it delivers CRT-level image quality in plasma-screen form factors, but without plasma's huge resolution-limiting pixels and high power consumption. Liquid Crystal on Silicon (LCoS) screens turned out to be a non-starter - the tech works for projectors, but several major corporations have dropped a whole lot of money into discovering that it doesn't work for monitors. SED could be very tasty, though.
And then there's micro-projectors - occasionally called "holographic projectors", but having nothing to do with R2-D2's integrated imaging device. They're just little tiny video projectors, possibly using a standard light-source-and-panel architecture (lit by a giant LED), possibly using scanning lasers (red lasers, we've got; compact low-power green and blue lasers to complete a colour picture may well be doable in a few years).
Mitsubishi already have a tiny LED-lit projector that looks like a regular business unit, only smaller; more will follow it. And of course, if all you need is an alphanumeric display, that's completely trivial.
However they happen, micro-projectors will let devices like PDAs or mobile phones deliver small-computer-monitor image sizes, provided there isn't too much ambient light. The image quality will be limited by the quality of the surface you're projecting on, but a lot of people would be perfectly happy with a red-only monochrome laser scanner display if it gave even 640 by 480 resolution. That'd be a huge leap forward for cell-phone user interface capabilities, not to mention letting you play Space Invaders on the ceiling with the lights out.
Also coming: Transparent circuits. Semiconductor materials you can see through - which may also end up being tough and surprisingly easy to make, so may turn up in all sorts of places where transparency isn't actually necessary, and just look cool. They'll be useful for sci-fi displays that look like a pane of glass when inactive, for liquid crystal displays with no "screen door" effect, and for displays with built-in, invisible computing power.
I also suspect that software-defined radio is gonna be huge. The term "software radio" makes people think it's just about replacing your FM tuner, but "radio" is used generically here. Any frequency (or any spread-spectrum frequency package), any signal; all done by a general purpose radio computer that can pull signals right out of untuned RF hash.
Software radio can receive video transmissions, data transmissions, encrypted satellite TV that you're meant to be paying for, anything. And send it too, of course, if you've got a transmitter. You need an RF front end to feed high frequency signals to a software radio board, but apart from that all you need is appropriate antennas and the board, and that's just DSP hardware that's already available to enthusiasts for less than a thousand bucks Australian.
Once you've got the hardware, you just run receiver software for whatever you want to receive. The software, legal or not, seems likely to be distributed around the world as fast as the No-CD patch for the latest FPS.
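The "receiver software" part really is just maths on sampled RF. As a rough illustration of the sort of thing such software does - this is a minimal sketch in Python/NumPy, not the code from any actual SDR product - here's an FM demodulator working on synthetic complex IQ samples, the standard format a radio front end hands to the DSP:

```python
import numpy as np

def fm_demodulate(iq, fs):
    """Recover baseband audio from complex IQ samples of an FM signal.
    FM encodes the audio as instantaneous frequency, i.e. the rate of
    change of phase; multiplying each sample by the conjugate of the
    previous one yields that phase step directly."""
    phase_steps = np.angle(iq[1:] * np.conj(iq[:-1]))
    return phase_steps * fs / (2 * np.pi)  # phase step per sample -> Hz

# Synthesise an FM signal carrying a 1kHz tone, then demodulate it.
fs = 250_000                            # sample rate, Hz
t = np.arange(fs // 10) / fs            # 100ms worth of samples
tone = np.sin(2 * np.pi * 1000 * t)     # the "audio" being transmitted
deviation = 5000                        # peak frequency deviation, Hz
phase = 2 * np.pi * deviation * np.cumsum(tone) / fs
iq = np.exp(1j * phase)                 # complex baseband FM signal

audio = fm_demodulate(iq, fs)
# The recovered waveform is the original tone, scaled by the deviation.
print(f"{np.max(np.abs(audio)):.0f}")   # prints 5000
```

Swap the demodulation maths and the same samples become AM, or a digital data protocol, or anything else - which is exactly why one lump of DSP hardware plus interchangeable software covers so much ground.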
Never mind ripping off pay TV; this'll let people invent whole new wireless networking protocols as fast as they like, to stay one step ahead of The Man.