The price of power

This page published 20 August 2002. Last modified 03-Dec-2011.
Back at the start of 2001, various parts of California were subject to rolling blackouts - deliberate power cuts, done to reduce the load on an overtaxed electricity grid. Without the deliberate blackouts, there was a good chance of serious unplanned brownouts and blackouts, as overstressed power distribution hardware failed, leading to more load on other parts of the network and more failures. The sort of thing that many dire Y2K warnings tended to be all about, in other words.
If you ask some people, information technology in general, and the Internet in particular, are significantly to blame for this sort of thing. Not primarily, mind you, but a large and growing slice of the pie. One pound of coal burned for every two megabytes of data transferred, don't you know.
This claim doesn't stand up to scrutiny. Lawrence Berkeley Laboratory has in fact comprehensively rebutted the above-linked "Greening Earth Society" study, which got a lot of publicity after it got plugged in Forbes magazine. You can read LBL's cogently argued response in Adobe Acrobat format here; in brief, the study was done by some people whose calculator seems to have a seriously wandering decimal point. The fact that the Greening Earth Society is bankrolled by the coal industry is, of course, neither here nor there.
It's now obvious to everybody who's been paying attention that the Californian power shortages had a lot more to do with a badly failed deregulation scheme and a fine selection of associated venality, as argued on Salon.com at the time, here.
But there's still an interesting issue here. The fact that this fundamentally broken study could attract so much serious attention, and that people are still quoting the darn thing today, highlights most people's ignorance about how much power various information technology whatsits use, and what that means.
When California was going dark, I was working at a big New Media venture. I could tell, because the decor was sort of like a Borg cube redecorated by the Play School designers, and if I looked out the window I saw the Sydney Harbour Bridge, and quite a lot of water. Dead giveaway, that.
I'm not there any more, by the way.
Anyway, that office was, at the time, heavily infested with desktop PCs running Windows NT 4. Not exclusively - there was a Mac or three, there were various Win98-using laptop owners, there was non-x86-compatible gear for serving up Web sites. And scurrilous rumours circulated about Morlocks downstairs who might possibly have a passing familiarity with some outlandish Scandinavian UNIX variant, but when I asked Microsoft about that, they faithfully assured me that whatever it was, it caused hair loss and impotence and should be avoided at all costs.
By and large, though, it was an NT4 shop. And NT4, she has no power management.
No monitor sleep, no system standby, no hard drive spin-down. A single-processor NT4 box will probably be set to send the CPU a "HLT" instruction when it's idle, saving a little power. But that's about it.
Some vendors have managed to bludgeon NT into doing some manner of power management on their systems - particularly laptops. Computers with BIOS support for power saving technology can override what the operating system wants to do, to some extent at least. But none of the NT boxes in my office seemed to have any such niceties.
So when I left the office, later than most people, I saw a vast expanse of monitors showing either screen savers or the NT4 "press this button to restart" screen, which is exactly what they were going to keep doing until someone sat back down at that desk the next day.
That's something like 90 watts per screen, all night, every night. Another few watts for every spinning hard drive, another 20 or 30 watts for the rest of the computer. Say a hundred watts per PC, for simplicity's sake.
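The per-PC figure above is easy to sanity-check. Here's a minimal sketch using the article's own estimates (the 90 watt monitor figure and the ten-odd watts for the rest of the box are ballpark numbers, not measurements):

```python
# Back-of-the-envelope check of the ~100 watt per-PC estimate.
monitor_w = 90        # CRT displaying a screen saver all night
rest_of_pc_w = 10     # spinning drives plus the rest of the idle box
total_w = monitor_w + rest_of_pc_w

overnight_hours = 14  # assume 6pm to 8am
kwh_per_night = total_w * overnight_hours / 1000
print(f"{total_w} W per PC, {kwh_per_night:.1f} kWh wasted per night")
```

About a kilowatt-hour and a half per machine per night, in other words.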
There are lots of workplaces that behave like this. It sounds like a pretty darn horrifying waste of energy, and in some ways I suppose it is, but not if all you care about is the balance sheet.
Compared with other bills - notably, what it costs to occupy a chunk of prime show-off real estate, never mind what it costs to keep it lit and cooled and full of allegedly productive people - the extra money offices commonly spend on things that're left on when they needn't be is no big deal.
This is only slightly because Australia has very cheap electricity. Residential power here costs a third of what it does in Japan and about half of most European prices. Industrial power's also about half price compared with Europe, and even cheaper compared with Japan. So you're only paying, say, 12 Australian cents per kilowatt-hour, tops.
But even if you were paying 50 cents per kilowatt-hour, a hundred left-on PCs each wasting a hundred watts would still only cost you five bucks an hour. Big deal.
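If you want to see the sum written out, here it is, with the deliberately pessimistic 50-cent tariff:

```python
# 100 left-on PCs, 100 watts each, at a worst-case tariff.
pcs = 100
watts_each = 100
tariff = 0.50  # dollars per kilowatt-hour; most places pay far less

kw = pcs * watts_each / 1000  # 10 kW of wasted load
cost_per_hour = kw * tariff
print(f"${cost_per_hour:.2f} per hour")
```

Five dollars an hour, as advertised. Swap in the 12-cent Australian rate and it's pocket change.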
There are other associated costs, but they don't work out at very impressive numbers either.
If you leave PCs and lights on in parts of your office that nobody's in, they make heat. Everything that does something makes heat, and that's a physical law.
You may have your air-con turned off overnight, but you'll need to pump that heat out sometime. Build a skyscraper in the middle of Siberia and it'll still need cooling if it's got normal populations of heat sources - inanimate or otherwise - in it.
If you have to wind your, say, 200 horsepower air-con plant up to full wellie at 0700 hours so the place won't still be toasty warm by the time the drones start coming to work an hour later, that's another 150 kilowatt-hours down the pipes.
But hey, at 12 cents per kilowatt-hour, that's only eighteen bucks.
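That arithmetic, too, checks out. The sketch below treats the plant's 200 horsepower rating as its electrical draw, which is a rough assumption (real air conditioners move considerably more heat than the electricity they consume), but it's the one the article's numbers imply:

```python
# 200 hp air-con plant, flat out for one hour, at 12c/kWh.
hp = 200
watts_per_hp = 745.7   # one mechanical horsepower in watts
hours = 1.0
tariff = 0.12          # dollars per kilowatt-hour

kwh = hp * watts_per_hp * hours / 1000  # ~149 kWh, call it 150
cost = kwh * tariff
print(f"{kwh:.0f} kWh, ${cost:.2f}")
```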
Which is why nobody gives much of a toss about leaving a squillion fluorescent lights and half a squillion workstations buzzing away all night. If you can afford to have the gear in the first place, power costs are likely to be trivial.
Any time you actually make some computer-ish object, you incur an energy cost, but that's not too amazing, either. It's very difficult to add up the "power value" of all of the components and procedures that go into the manufacture and delivery of, say, a PC, but it's also very hard to defend a claim that they add up to more than a few per cent of its retail price.
So what you end up with is one of those odd situations like the old gas street lights, which burned all day because it was cheaper to keep 'em running than to pay people to extinguish and relight them at dawn and dusk. Or to come up with a self-switching version that didn't blow itself sky-high one day out of ten.
Nonetheless, you'd think there'd be not only a small financial benefit, but a bit of PR value, in not sucking down more power than you have to. Some companies have stringent power saving policies, after all.
If your workplace doesn't have such policies, though, don't hold your breath for things to change. Unfortunately, making people do something as simple as turning their darn monitors off, or turning off their whole computer when they're staring at a message telling them that it's now safe to do so, isn't as easy as it sounds.
Send an "Austerity Drive" e-mail to the whole office, and before you know it half of your bolshie employees are spreading rumours about how the company's obviously out of cash and everyone's going to be fired tomorrow, while the other half took exception to the tone of the message and are running fan heaters and arc lamps all day by way of protest. And you can bet that some people would keep leaving gear on because they believe it's most likely to fail on power-up, like a light bulb. That's not true for PCs, but it's a popular myth.
Of course, what power costs you, the consumer, isn't what it costs, full stop. There's greenhouse gas emissions and other power plant exhaust pollution, depletion of resources, destruction of wilderness by mines and their associated infrastructure, and more.
Most of Australia's electricity is generated by coal-fired power plants, which emit an awful lot of carbon dioxide. And coal plant fly ash contains radioactive uranium and thorium in surprising amounts. Even if the ash is effectively caught by filters, something still has to be done with it.
Not that fly ash scraped out of a filter and dropped into a bucket is actually amazingly dangerous stuff, but waste of similar levels of activity that happens to come from nuclear power plants is treated like pure megadeathium. You certainly can't get away with burying it in dams.
If all this makes you feel like a not entirely willing member of a plague of locusts, take comfort in the fact that things are going to get better.
For a start, places using dumb operating systems and old, non-power-management-friendly hardware are switching over to newer systems, which support the Advanced Configuration and Power Interface and its descendants. And which, more importantly, will be set up to use power management properly out of the box, and will probably fall back all the way to a practically-zero-power hibernation mode if left alone for long enough.
The power consumption of PCs when they're actually being used isn't falling much. CPU and coprocessor (graphics and sound chips, for instance) power consumption figures follow an odd zig-zag path with no immediately obvious short-term trend. Power per transistor is dropping, but transistor counts are streaking upwards fast enough to neutralise that. If you follow the graph back to the 386 days and before, it's got a clear upward slope (back when I were a lad, CPUs didn't need a heat sink and fan...), but that doesn't mean we'll all be running 200 watt CPUs in two years.
Pretty much every computer component besides the CPU can be counted on to use at least a little less power with each new iteration, but compared with the rates of change we're used to seeing in computer specifications, there's practically nothing going on here.
Cathode ray tube (CRT) monitor power consumption's not falling much either, but much lower power flat-panel screens are becoming more and more affordable. If environmentalism and portability aren't your primary concerns then CRTs are still clearly better value than LCD panels, but since LCD screens consume a lot less power than similarly sized CRTs - a third or less - more widespread adoption of the things can only help. In the meantime, environmentalists can salve their conscience by buying a CRT monitor, and sending the money they saved over an LCD to the green charity of their choice.
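To see why the donate-the-difference quip makes sense, run the numbers. The wattages and usage pattern below are my own illustrative assumptions (a 90 watt CRT versus a 30 watt LCD of similar size, used 50 hours a week), not figures from the article:

```python
# Annual power savings from an LCD over a CRT, at the 12c/kWh tariff.
crt_w, lcd_w = 90, 30        # hypothetical typical draws
hours_per_year = 50 * 52     # 50 hours a week
tariff = 0.12                # dollars per kilowatt-hour

saved_kwh = (crt_w - lcd_w) * hours_per_year / 1000
saved_dollars = saved_kwh * tariff
print(f"{saved_kwh:.0f} kWh, ${saved_dollars:.2f} saved per year")
```

Call it twenty bucks a year, against the several hundred dollar price premium LCDs command at the time of writing. The electricity saving alone won't pay for the panel any time soon.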
Right now, you should pay no attention to people who fell for the Greening Earth Society study or anything else that says that computers are hastening the exhaustion of our energy reserves, blah blah. Even if you don't buy all of the arguments about information technology making organisations more efficient and thus reducing power use in other areas - and they're clearly true, in many cases - the numbers just don't support the idea that even whole buildings chock-full of computers are sucking much more juice than they would be if the space were used for something else.
We're still waiting for Mr Fusion to make all this irrelevant. In the meantime, though, the sky may indeed be falling. But information technology is, at worst, not making it fall much faster.