Atomic I/O letters column #109
Originally published 2010, in Atomic: Maximum Power Computing
Reprinted here September 18, 2010. Last modified 16-Jan-2015.
My friend and I were having this little debate the other day. He thinks that you need at least a 650W power supply to power an average gaming system. Apparently any 650W will do. He even picked out an A-Power 650W for me!
What do you actually need to power an average gaming system such as a Core i5, 4GB RAM, 2 HDDs, Radeon 5770, etc?
My friend went so far as to take me to the school IT guys, and they mentioned that graphics cards can chew up to 300W of power! I understand that high-end GPUs (that I can't afford) are like that, but he thought that an average graphics card can need 300W too.
I know this probably has been asked a lot, but could you guys possibly restate some information about efficiency, 12V rails, amps and such just for them?
There's no way to put a definite figure on the power consumption of an "average gaming system", because there are many possible configurations for such a system.
A computer like the one you describe, though, shouldn't be able to draw more than about 300 watts no matter what you do to it. Maybe 400, if you really push for some overclocking records.
But that doesn't mean that a "300W" PSU will be able to power it, un-overclocked, perfectly. For three reasons.
One: The current draw of your computer on the different PSU rails probably won't perfectly match the current-rating division of those rails on the PSU. Modern PCs generally need a lot of amps on the 12V rail.
Two: Even high-quality PSUs are often unable to deliver close to their maximum rating for long periods of time.
Three: Most PSUs are not high-quality PSUs.
An awful lot of PSUs aren't even fair-quality. They can't deliver anything remotely like the wattage number on the sticker.
A decent "650-watt" PSU certainly should be more than enough for the Core i5 system you describe. An A-Power "650W" PSU might be OK, or might not; A-Power, here in Australia, appear to be just one among many anonymous rebadge brands clustered at the front of the cheap-hardware phone-book.
(That's if the A-Power you're looking at is the same A-Power I see, with PSUs in the 700-to-900W range... for fifty bucks. Yeah, right.)
There's a reason why one "650W" PSU costs $50 and another costs $200. The more expensive one will probably weigh a lot more, last rather longer, be made in a much more reputable factory, and be able to get a lot closer to its sticker capacity.
At the very bottom end of the market there are PSUs that come from the factory with no power rating sticker on them at all. The retailer just sticks a "400W" sticker on the box if he wants to sell it as a 400W PSU, or a "700W" sticker if he wants to make a bit more money. Your typical dirt-cheap allegedly-500W no-name PSU today can confidently be expected to be good for not much more than 250W, and even at that load level it'll probably only survive a few weeks.
Fortunately, magazines like Atomic and all of the major review sites are well aware of this problem, and do proper load tests. This makes it quite easy to find the PSUs with honest ratings.
(There are a lot more PSU brands than there are PSU factories. A few top-flight factories, like "Channel Well Technology" for instance, make PSUs for several big brands. CWT make (or have made) Thermaltake, Antec and Corsair PSUs, among many others. Check out this page for a not-recently-updated list of many of the myriad brands in just the more honourable section of the PSU market.)
PC power consumption is, as you'd expect, usually dominated by CPU and graphics-card wattage. It's pretty easy to get even low-end CPUs to draw a big old slab of watts by overclocking; power consumption increases directly with clock speed, but also with the square of any voltage increase you have to use to get an overclock to work. So if you increase voltage by 10% to get a 20% overclock, power consumption increases by a factor of 1.1-squared times 1.2, or 1.45.
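The rule of thumb above is easy to check with a couple of lines of back-of-envelope arithmetic; this little sketch just restates the f-times-V-squared relationship, with the same illustrative numbers as the text:

```python
# Rough dynamic-power scaling for an overclock: P grows linearly with
# clock speed, and with the SQUARE of any core-voltage increase.

def power_scale(freq_ratio, volt_ratio):
    """Factor by which dynamic power consumption grows for a given
    clock-speed ratio and core-voltage ratio."""
    return freq_ratio * volt_ratio ** 2

# A 20% overclock that needs a 10% voltage bump to be stable:
factor = power_scale(1.2, 1.1)
print(round(factor, 2))  # 1.45
```

Note that this only covers the chip's dynamic power; leakage and the rest of the system don't scale the same way, so treat it as a ballpark, not a spec.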
The Thermal Design Power (TDP) for the Radeon HD 5770 is 108 watts. No Core i5 CPU so far has a TDP above 95 watts.
The TDP is the most power a component is supposed to ever draw for any length of time. It's the power number that's used when designing the cooling system for the computer. You can squeeze CPUs and graphics cards up a little past the TDP if you really, really try, but even if you're running some distributed-computing application that keeps your CPU at 99% utilisation all day long, you're probably not exceeding the TDP. This is because different tasks light up different areas of the processor, and also hit the cache memory (which takes up a lot of real estate on modern CPUs) differently. Tasks that all apparently cause the same level of CPU utilisation can consume quite different amounts of power.
So taking TDP as a realistic maximum figure, a 108W-TDP graphics card plus a 95W-TDP CPU give a peak draw of 203 watts just for the CPU and graphics card. The whole system's realistic peak draw won't ever exceed 300W, and probably not even 250W.
And remember that this is a ceiling power figure. Real-world power consumption for this Core i5 system, even when you're playing a game and working both CPU and graphics card pretty hard, could easily average out at less than 200W, even after overclocking.
(There are also usually several differently-clocked CPUs in a given lineup that all have the same TDP. The slower models actually won't ever draw that much at stock speed; system designers just like to make sure a computer has enough cooling for the fastest CPU anybody's likely to upgrade it to. This includes chips that haven't even been released yet.)
There sure are 300-watt graphics cards out there, though, and you don't even need to buy a dual-GPU card. For a while now, the top-of-the-line single-GPU ATI graphics card has been the Radeon HD 5870, which has a TDP of 188 watts at stock speed. The top single-GPU Nvidia card when I first wrote this column was the GeForce GTX 285, which tops out at 204 watts; now that I'm putting the column on the Web, the GTX 480 has pushed that to 250 watts. Overclock these with gusto and you could get close to 300 watts, and probably comfortably exceed it with the GTX 480.
There've been several graphics cards over the last few years that at least push the 200-watt line. None of them have been mid-range cards like the one you're considering, though, and I don't think any mid-range card is ever likely to draw that much.
(Wikipedia have neat comparison pages for ATI/AMD and Nvidia GPUs, by the way, which give TDP numbers with the other specs. As you can see, the midrange cards have always had much less alarming power draws than the top-enders.)
PSU efficiency, like a lot of environmentalist issues, makes a difference to the planet but not much of a difference to the individual user. Many PSUs these days have excellent efficiency at a variety of load levels, but even if you've got a computer that draws 300 watts, 24 hours a day, then a 70%-efficient PSU will draw 429 watts from the wall, while a 90%-efficient one (which will pass the 80 PLUS "Gold" test) will draw 333W. Even if you pay a quite high (for Australia) 20 cents per kilowatt-hour, the less efficient PSU will only cost you about $167 more per year. And you probably, of course, don't have a 300-watts-non-stop computer.
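If you want to run the same sums for your own computer, the arithmetic is just load divided by efficiency, times hours, times tariff. Here's a sketch using the assumed figures from the text (a constant 300W load and a 20c/kWh tariff; plug in your own):

```python
# Back-of-envelope PSU running-cost comparison. All inputs are the
# illustrative figures from the text, not measurements.

LOAD_W = 300.0          # power actually delivered to the PC
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.20    # Australian dollars

def wall_draw(load_w, efficiency):
    """Watts drawn from the wall to deliver load_w to the computer."""
    return load_w / efficiency

def annual_cost(load_w, efficiency):
    """Yearly electricity cost for running this load non-stop."""
    kwh = wall_draw(load_w, efficiency) * HOURS_PER_YEAR / 1000.0
    return kwh * PRICE_PER_KWH

print(round(wall_draw(LOAD_W, 0.70)))  # 429 watts at the wall
print(round(wall_draw(LOAD_W, 0.90)))  # 333 watts at the wall
extra = annual_cost(LOAD_W, 0.70) - annual_cost(LOAD_W, 0.90)
print(round(extra))                    # dollars per year difference
```

Halve the hours and you halve the cost difference, which is why efficiency matters so much less for a computer that's only on in the evenings.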
One other tip: It's a good idea to have a spare PSU sitting on the shelf, on account of power supplies' well-known tendency to die at the start of a long weekend. Even if your standby PSU has a considerably lower power rating than your in-use one, it should still be able to run the operating system and 2D applications like e-mail and your Web browser. This is a heck of a lot better than nothing while you wait for a warranty replacement on a popped PSU.
My MP3 player has a cracked screen. Except it hasn't.
The screen IS definitely cracked (don't know how I did it), but every time I've seen a picture of a cracked LCD it's gone all crazy with the whole area around the break completely illegible, and my screen is still readable. It just looks as if I'm reading it through a cracked car windscreen.
Does this mean I can fix the screen without replacing the whole panel?
Yes, it is at least in theory possible to fix the screen without buying a whole new LCD module.
What you've managed to do is crack the glass cover over the panel itself, without cracking the panel. So all you need is something to replace that cover.
There are a heck of a lot fewer LCD panel sizes than there are LCD panel types, so the glass from some other panel may work fine. But you just try finding separate glass that's the right size.
Fortunately, your MP3 player probably doesn't have any fancy coatings on its glass, or, if it does, you won't really mind giving them up. In that case, you could probably just go to a glass shop or picture-framing place and get some thin glass cut to the right size. (You can even get anti-reflective glass, made for museums and art galleries.)
Glass replacement is completely impossible for a lot of LCD panels, though, because the relatively few monitor and laptop LCDs that have glass on the front at all (rather than plastic) have that glass bonded solidly to the panel's sandwich construction. There are a few exceptions, like the iMac monitors that have the glass held in place by magnets around the edges, but usually it's more like the "unibody" Macbooks, in which the whole screen-and-lid assembly isn't meant to be repairable at all.
Note that you also can't replace just the glass on most touchscreens, like - to stick with this impromptu Apple motif - the iPhone and iPod Touch. The touch sensor is bonded to the glass, so you have to buy both together.
All these disclaimers aside, though, the glass covers on small LCD screens (cameras, media players, digital photo frames...) are often quite easy to remove, with a hair-dryer to soften the adhesive and a suction cup and/or sticky tape to pull the glass off. You can often even buy an actual official perfect-sized replacement part, either brand new or parted out from a dead device.
(Some people replace the glass when it's not broken at all, because they just can't abide a reflective glossy screen, and want to replace it with something that has a matte coating. There are also people who go the other way! I think most matte screens don't have glass on the front at all, while most glossy screens have a glass front that's permanently bonded to the LCD sandwich. Trying to matte-ify such a screen with add-on plastic film is not a good idea.)
A while ago, I had to burn a DVD-RW for use on a Mac, from an image file. The image file name ended in ".dmg", which nothing in Windows could cope with, but I read on a forum that you can just rename a .dmg to .iso, and it'll work. It DID work!
Yesterday, though, I had to do the same thing with a different DMG, and it didn't work at all. I might as well have been renaming an AVI or an MP3 to ISO and trying to burn that.
Has the DMG format changed? What's going on?
A plain, uncompressed DMG is essentially just a raw disc image, which is close enough to an ISO that the rename trick works. DMG also, however, supports compression and encryption on top of that. If a given DMG uses either of these extra features, the file becomes undecipherable to anything that doesn't support the full DMG format.
There are a number of ways to convert these DMGs to ISOs on a PC, but I think they're all either painfully inelegant, or require commercial software. So if it's at all possible, try to translate the file on a Mac. OS X's "hdiutil" console command makes this pretty simple.
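On a Mac, the whole job is one hdiutil command. A sketch, with placeholder filenames (hdiutil's UDTO format writes a raw CD/DVD master with a .cdr extension, which you can simply rename to .iso):

```shell
# macOS only: convert a compressed or encrypted DMG to a plain ISO.
# "input.dmg" is a placeholder; substitute your own file.
if command -v hdiutil >/dev/null 2>&1; then
    hdiutil convert input.dmg -format UDTO -o input  # writes input.cdr
    mv input.cdr input.iso                           # a .cdr IS a raw image
else
    echo "hdiutil is only available on OS X"
fi
```

An encrypted DMG will prompt you for its password during the conversion; after that, the resulting ISO burns like any other.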