Atomic I/O letters column #63
Originally published in Atomic: Maximum Power Computing
Reprinted here November 2006.
Last modified 16-Jan-2015.
This should really be a pretty simple question. Basically, these days you can get LCDs with response times of 4ms (in the case of the Viewsonic VX922, a claimed G-to-G of 2ms), much faster than the old standard of 16ms. However, I was wondering just how much these newer response times are pure hype.
I mean, ignoring the Black-to-White/Grey-to-Grey measurement standards that are being used to fudge the results a little, here's the reason I ask: a standard CRT (in a disgusting office block where the company hates your eyes) refreshes at 60Hz, yes? So it would have a physical "response time" of 1/60th of a second, or 16.6ms.
Now, one can't see ghosting at all on a CRT, yet would-be gamers insist that a 16ms LCD will show noticeable ghosting. I'd assume this would simply be down to the fact that on a CRT the entire picture is refreshed, as opposed to only a set of moving pixels on an LCD?
I guess the actual question here is whether you can actually compare a response time to refresh time in this manner, or if I'm barking up the wrong tree?
As you say, CRTs redraw the entire screen every time they refresh. The electron beam zips over the phosphors and draws a whole new picture, even if that picture's largely, or entirely, the same as the old one.
In the olden days, a lot of CRTs had high enough "persistence" that they actually did ghost, very badly. You can see this effect clearly on an Apple II green screen with text scrolling on it, for instance. Persistence is how long the phosphor keeps glowing even if the electron beam doesn't shine on it, and it could be the thick end of a second (!) on some monitors.
Since there weren't any action games with detailed full screen animation for those computers (don't even ask about video playback), and since high persistence eliminates flicker no matter how lousy the refresh rate, those screens were a good solution at the time. Modern CRTs have very low persistence, and thus pretty much zero ghosting, as you say.
Modern LCDs update the screen in a more straightforward way. The image gets dumped to the screen all in one go each time it refreshes, and all of the subpixels (and the pixels that're made up of subpixel triplets) change at once (if they need to).
The difference is that all of the subpixels only actually change at once in Physics Experiment Land. In reality, they all start changing at once, and the monitor's hardware does its best to get them all to change as fast as possible. Some, if not most, modern LCDs deliberately overdrive grey-to-grey subpixel changes to get them to shift intensity faster, then chop off the drive voltage when the subpixel's got to pretty much the right place.
(Think of the subpixel intensity as being like a lead ball being pulled around by a rubber band. The other end of the rubber band represents the drive voltage. Small changes take longer, because the band isn't stretched as hard, but you can stretch the band far to get the ball moving, and then move the band-end back to where you want it. Fortunately, the smaller the change, the less visible it is. That's why you see ghosting for medium-sized intensity changes, not big ones or tiny ones.)
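If you like, you can put numbers on the rubber band. Here's a little Python sketch that models a subpixel as a first-order lag chasing its drive voltage, with and without overdrive. All of the constants are invented for illustration; they're not measurements of any real panel.

```python
# Toy model of LCD subpixel response: intensity chases the drive voltage
# like a ball on a rubber band (a first-order lag). All numbers are made
# up for illustration, not measured from any real panel.

TAU_MS = 8.0      # assumed time constant of the liquid crystal
STEP_MS = 0.01    # simulation time step

def settle_time(start, target, drive_fn, tolerance=0.02):
    """Milliseconds until intensity gets within `tolerance` of target."""
    v, t = start, 0.0
    while abs(v - target) > tolerance:
        drive = drive_fn(v, target)
        v += (drive - v) * (STEP_MS / TAU_MS)  # first-order response
        t += STEP_MS
        if t > 100:                            # safety valve
            break
    return t

def plain(v, target):          # just aim straight at the target level
    return target

def overdriven(v, target):     # stretch the band further, then let go
    return target + 2.0 * (target - v) if abs(v - target) > 0.05 else target

print("50%% -> 60%% grey, plain:      %.1f ms" % settle_time(0.5, 0.6, plain))
print("50%% -> 60%% grey, overdriven: %.1f ms" % settle_time(0.5, 0.6, overdriven))
```

The overdriven transition lands noticeably sooner, and the smaller the change, the harder overdrive can safely push - which is exactly why it helps grey-to-grey figures so much.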
So, let's imagine a CRT and an LCD with the same 16ms (say) screen refresh rate, but let's say the LCD also has a 15ms (to avoid confusion) grey-to-grey response time. We've got one screen that changes all of its pixels pretty much instantly when the electron beam sweeps over them, every 16ms, and another screen that starts changing all of its pixels every 16ms. But the LCD screen may not finish changing a given pixel until 15ms later, barely in time for another screen refresh.
Hence, ghosting, and a reason for rabid gamers to upgrade to LCDs with 0.04ms response times.
We have, by the way, finally reached the point where big CRTs cost more than similarly specced LCDs. A desk-hogging, hideously non-portable, non-flat "21 inch" (20 inch viewable) CRT like, say, a Samsung SyncMaster 1100p Plus, now only costs about $AU800 delivered (if you can find someone who'll sell it to you - a lot of computer stores don't stock CRTs at all any more, let alone huge ones with a lousy three-figure price).
But the cheapest 20 inch 1600 by 1200 5 or 6ms LCDs are now well under $AU600 - a couple of hundred bucks less than the CRT. There's no reason to upgrade from a 21 inch CRT to one of those if you're a gamer (fuzzier pixels certainly don't hurt; that's what FSAA's all about), but if you're not a colour accuracy fiend, the CRT is at last fading into history.
(As I write this, Dell are clearing out 20 inchers quite a bit cheaper again, and selling their 30 inch 3007WFP behemoth for only $AU1999. That's because there are newer versions on the way, though.)
Could you please explain these RAM settings?
My ASUS K8VSE Deluxe Socket 754 mobo BIOS has, under "ECC Configuration":
DRAM ECC Enable
MCA DRAM ECC Logging
ECC Chip Kill
DRAM Scrub Redirect
The options for these settings are all "Enable/Disable".
Dram BG Scrub
L2 Cache BG Scrub
Data Cache BG Scrub
These options have settings ranging from 40ns to 655.4us.
What should they be set to? Do you have to have ECC RAM to make use of them?
None of these settings will do anything if you don't have ECC RAM, which I presume you don't. There's no reason to pay extra to get the slower ECC memory if you're not running a server, render-box or scientific computing machine that has to have maximum stability and total data reliability. More the first than the second, too; individual flipped-bit errors that actually matter usually cause some other obvious error, and don't just result in insidious bad data being written to disk.
Incidentally, it's perfectly possible for RAM errors to not matter. A transient error in a piece of memory that's not being used for anything, or that's being used for something trivial (Winamp's data buffer, say), won't do any harm.
Now, assuming you've got ECC (Error-Correcting Code) RAM, the "ECC Enable" setting is what turns its basic functions on and off. With ECC off, the memory will behave like ordinary non-ECC RAM, which is faster than ECC.
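The trick the ECC hardware performs is conceptually simple. Here's a toy Hamming(7,4) coder in Python - real DIMMs use a wider code (typically eight check bits protecting 64 data bits), but the single-bit-correction idea is the same:

```python
# Toy Hamming(7,4) code: the same single-bit-correction idea ECC memory
# hardware uses, just on 4 data bits instead of 64.

def encode(d):
    """Encode 4 data bits into a 7-bit codeword (positions 1..7)."""
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def decode(c):
    """Return (data bits, syndrome). A non-zero syndrome is the
    1-based position of the single flipped bit, which gets corrected."""
    s = 0
    for i, bit in enumerate(c, start=1):
        if bit:
            s ^= i                 # XOR of the positions of all 1 bits
    if s:                          # non-zero: flip the offending bit back
        c = c[:]
        c[s - 1] ^= 1
    return [c[2], c[4], c[5], c[6]], s

word = [1, 0, 1, 1]
sent = encode(word)
sent[4] ^= 1                       # a cosmic ray flips one bit in transit
data, syndrome = decode(sent)
print(data == word, "error at position", syndrome)
```

The syndrome comes out zero for an undamaged codeword, and points straight at the flipped bit otherwise - no searching required, which is why the hardware can do it on every read.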
"MCA DRAM ECC Logging" makes the memory controller report ECC errors through the CPU's Machine Check Architecture registers, so the operating system can log them.
The marvellously-named "ECC Chip Kill" slows the RAM down even more, but allows multi-bit error correction.
The "Scrub" stuff allows corrected values to be written back to the RAM - which, yes, slows it down yet further. "Scrub Redirect" writes the corrected values as soon as the error's detected; the "BG Scrubs" all do it when the memory's idle instead (in the "BackGround"), and the timing values determine how often.
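Those odd-looking timing values make more sense if you do the arithmetic. As I understand it, the K8's background scrubber handles one 64-byte cache line per interval (check AMD's documentation if it actually matters to you), so the interval sets how long one pass over all of your memory takes:

```python
# How long one background scrub pass over all of RAM takes, assuming the
# scrubber processes one 64-byte cache line per interval (my understanding
# of the Athlon 64's scrubber, not a datasheet figure).

def full_scrub_seconds(mem_bytes, interval_ns):
    lines = mem_bytes // 64
    return lines * interval_ns / 1e9

GIB = 1024 ** 3
for ns in (40, 655_400):           # the BIOS range: 40ns to 655.4us
    print("%8d ns/line: %.0f seconds per pass over 1GB" %
          (ns, full_scrub_seconds(GIB, ns)))
```

At the fast end you scrub a gigabyte in well under a second; at the slow end one pass takes a few hours. Either way the performance cost is tiny, which is why leaving it on is an easy call.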
If you need ECC memory and have therefore bought some, you probably want all of these options on. Unless you're restarting your important server and intend to play some maximum-frame-rate games on it, or something, in which case you should turn ECC off entirely for the duration.
I usually have about a zillion messages open in Eudora, while I wait for workmates and clients to be fired or die, and absolve me from responsibility for replying to them. Keeping the messages open makes them think I still care.
The other day, though, I tried to open Eudora when I was running a few other memory-hungry apps and it just went "bing" and crashed. When I closed the other apps and ran Eudora again it started from scratch - now no messages or mailboxes are open.
This is terrible! I look as if I'm doing as little work as I really am!
I've got backups. I tried running Eudora from one of them, and it worked fine except for complaining a bit about being in the wrong place. There was my big line of little message icons again. But the most recent backup is from a week ago, and since then I've received important porno spam that I don't want to lose.
Is there an index file that stores the open messages, or something? Is it a registry thing?
Yes, there is such a file. No, it's not a registry thing. Eudora is one of those old-style apps that keeps everything in a cryptic config file, eudora.ini, in its program directory.
Restore just Eudora.ini from a backup, and your clutter and slow Eudora startup time will be back, without any new messages being lost (though they won't, of course, be open).
I've been setting up a bedroom studio based around my Athlon 64 X2 WinXP gaming machine, with the absolute minimum hardware (because buying the computer emptied my bank account :-).
I don't need multitrack in or out, since I'm using softsynths (Hurray for Reason!) for almost everything but vocals, and I've got a little USB line in thing that works fine whenever I need to record something.
I'm using a USB DAC and a home theatre amplifier as my audio output system, which works great (for DVDs, too!), except when I want to play something into the sequencer (via, of course, my USB MIDI interface). Then, and only then, it becomes obvious to me that the USB audio output is laggy.
I've read up on latency, and I kind of understand why it happens, but it seems kinda stupid that my super-computer can't output audio less than 16 milliseconds or so after MIDI tells it to play a note. That's too much lag to monitor properly.
There are "pro" sound cards with low latency ASIO drivers, and some of them are almost affordable. They've all got multiple ins and outs and other extras that I don't need, but I guess I'm stuck with them. Which one should I get? Or should I get some (gasp) Creative card with even more bells and whistles that costs less and is also supposed to be low latency?
Rejoice - your salvation lies not in hardware, but in more software.
But don't rejoice too much, because you're probably going to have to pay for the software that'll solve your problem.
(Actually, what you should do first is make sure your home theatre amp isn't set to some processing mode that injects its own delay.)
If that's no good, go to the USB Audio site and download the demo version of their low latency USB audio driver. It goes "bing" every 30 seconds but is otherwise fully functional, and it will allow the lightning latency you need. Down to a screaming (figuratively speaking) four milliseconds for 44.1kHz audio, actually, as opposed to the buzzing (literally speaking) 14ms you've probably already experienced.
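You can sanity-check those figures yourself, because buffering latency is just buffer length divided by sample rate. The buffer sizes below are back-calculated from the millisecond figures, not the driver's documented settings:

```python
# Buffer-size arithmetic for audio latency: latency = frames / sample rate.
# The specific sizes are back-calculated illustrations, not real driver values.

def latency_ms(buffer_frames, sample_rate=44100):
    return buffer_frames * 1000.0 / sample_rate

def frames_for(ms, sample_rate=44100):
    return round(ms * sample_rate / 1000.0)

print(frames_for(14), "frames of buffering behind a ~14ms lag")
print(frames_for(4), "frames behind the driver's ~4ms")
print("%.1f ms for a typical 256-frame ASIO buffer" % latency_ms(256))
```

Note that this is only the buffering component; the USB bus and the DAC itself add a little more on top.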
The USB Audio ASIO driver runs in kernel mode, so you'd expect it to perform amateur open heart surgery on Windows, but the installer is actually extremely civilised - you don't even have to reboot. And yes, if you decide it's not for you, you can uninstall it (well, I could, anyway) without ending up with 103 identical broken sound cards in Device Manager.
The down side? The USB Audio driver seems to rather enjoy leaving unkillable zombie copies of huge softsynth applications sitting in RAM and preventing newly run programs from outputting sound until you reboot (well, it does on my system anyway, darn it), and the full version costs 44 Euros.
That's still cheaper than a new high quality sound card, of course. You might like to consider a new consumer sound card to use with ASIO4ALL instead, though.