The screen you'll be using tomorrow
Originally published 2004 in Atomic: Maximum Power Computing. Last modified 03-Dec-2011.
Monitors have never been exactly a hive of innovation compared with the rest of the PC market.
First, there were CRTs that cost so much that a 17 incher was a luxury. Then there were CRTs cheap enough that 19 inchers sprouted on half of the world's desks. Around the same time, there were LCDs whose price/performance ratio was dreadful. And now better, cheaper LCDs are in the process of eating the whole mainstream display market.
The big-consumer-CRT-monitor market, in particular, now hardly even exists any more. A while ago, I had the joy of waiting months for one solitary 21 inch Samsung SyncMaster 1100p Plus to be shipped to Australia for me.
And that's about it. I've just described the last 20 years of PC display development. It'd take me a wee bit longer to do the same thing for CPUs.
We're all, of course, hungry for something new and thrilling to replace LCD and CRT, if only so we can have that cool witchcrafty feeling that only comes from using things that couldn't be bought at any price ten years ago.
Unfortunately, besides a couple of possibly-nifty LCD variants, our hunger looks set to continue for a while yet.
Everybody got all hot and bothered about Organic Light Emitting Diode (OLED) screens a year or three ago, but the darn things still don't seem to last much longer than 2000 hours.
(Well, technically it's just the blue OLEDs that don't last very long; green lasts a lot longer and red lasts for ages. But people don't seem all that crazy about paying for a TV that turns yellow, then red.)
Which is a bit of a problem. We can already make desktop-sized OLED screens, as well as the tiny ones you can buy on some phones and cameras and MP3 players today, but nobody wants to make desktop OLEDs while the things can still be expected to die within a year of 5.5-hours-a-day use.
Plasma, in case you're wondering, isn't likely to ever make an impact in the PC display market. The problem there is that the "feature size" is too big; nobody can make a plasma panel with pixels anything like as small as you need for a computer monitor. And dearly though we'd all like a huge big-pixelled PC-resolution plasma to use from a bit further away than a regular monitor, we probably don't want to pay $50,000 for it.
Oh, and Liquid Crystal on Silicon (LCoS) didn't quite happen for monitors.
So that's the bad news. But a couple of other technologies are looking rather interesting.
First up, there's Surface-conduction Electron-emitter Display, mercifully abbreviated "SED". It's a phosphor-coated screen with a vacuum behind it, like a CRT, but it's got hard pixels each made up of three rectangular subpixels, like an LCD. Each subpixel has its own tiny low voltage electron emitter right behind it, instead of being scanned by a magnetically-guided high voltage electron beam from afar, as with a CRT.
SED offers high brightness and a great contrast ratio, like a CRT (or better), but in a hang-on-the-wall form factor, like plasma, and with power consumption around that of LCDs. CRTs draw about twice as much juice for a given screen size, and plasma screens are even worse, which is why they need cooling fans.
The first Toshiba (co-developed with Canon) SED HDTVs will be hitting the market Real Soon Now for you've-got-to-be-kidding prices. But only ten years ago a VGA (640 by 480, sixteen colour) active matrix LCD monitor could set you back $AU5000, so SED TVs ought to sprout in ordinary homes soon enough.
And, unlike plasma, SED can go small. The smallest prototype SED electron emitters anybody's managed to make so far are only a few nanometres across.
An emitter is not a subpixel; add a control matrix and phosphor blobs and things get much bigger. But when you consider that the highest density displays in at all common use today - 1920 by 1200 WUXGA 15 inch widescreens for laptops - are made out of subpixels that're around fifty-six thousand nanometres wide, I think you can see SED's potential.
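That fifty-six-thousand-nanometre figure is easy to check yourself. Here's a minimal Python sketch; the panel numbers come from the article, and the assumption that each square pixel is split into three side-by-side subpixels is just the standard LCD stripe layout:

```python
import math

def subpixel_width_nm(diag_in, px_w, px_h):
    """Width in nanometres of one RGB subpixel, assuming square pixels
    each divided into three side-by-side subpixels (standard LCD stripes)."""
    aspect = px_w / px_h
    # Screen width from the diagonal, via Pythagoras on the aspect ratio.
    width_in = diag_in * aspect / math.sqrt(aspect ** 2 + 1)
    pixel_pitch_in = width_in / px_w
    return pixel_pitch_in / 3 * 25.4e6  # inches to nanometres

# The 15 inch 1920 by 1200 laptop panel from the article:
print(round(subpixel_width_nm(15, 1920, 1200)))
```

A 16:10 15 inch panel is about 12.7 inches wide, so each of its 1920 pixels is roughly 168,000 nanometres across, and each third of a pixel about 56,000.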
And then, there's projectors.
What's the big problem with today's relatively cheap, highly portable, long-lamp-life front projectors?
They don't work well when the lights are on, that's what. The picture goes grey.
But somebody at Sony had a thought.
That thought was "Hey, video projectors only spit out three colours of light, right? Narrow-band red, green and blue? How about we make a projector screen that's got filters over it that only let those narrow bands through? Wouldn't that give you a screen that reflects very little ambient light - only the light that falls in those three narrow bands - but lights up all bright and magical when you point a projector at it?"
Yes, it would, and yes, it does, and Sony and whatever other companies license or rip off the idea ought to have the new "dark screens" (Sony seem to be calling theirs the "Hi-Contrast Screen") on the market pretty soon. They're certain to cost a lot more than a second hand garage-sale slide projector screen, and they don't totally cure the ambient light problem, but they apparently come so close that you really will be able to use a front projector for ordinary everyday TV watching, as well as dim-the-lights movies.
For everyday computing, not so much - but for games, hell yeah. (That's a technical term.)
Projector technology's improving pretty rapidly, too, but not as fast as projector prices are falling.
So take heart, fellow screen-starers. With super-contrast screens for our projectors, and SED monitors that shouldn't break a sweat displaying 6400 by 4800 (that's about print resolution on a 20 inch screen), the future looks not just bright, but also smooth, and contrasty.
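And that print-resolution aside checks out: a 20 inch 4:3 screen is 16 inches wide, so 6400 pixels across it works out to 400 pixels per inch, comfortably inside the usual print range. A quick sketch:

```python
import math

def ppi(diag_in, px_w, px_h):
    """Pixels per inch of a display with the given diagonal and resolution."""
    # The pixel-grid diagonal divided by the physical diagonal.
    return math.hypot(px_w, px_h) / diag_in

print(ppi(20, 6400, 4800))  # 400.0
```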