Dan's Data letters #1
Publication date: 25-Sep-2002.
Last modified 03-Dec-2011.
I remember reading a year or two ago about the effects of vsync and how it worked. Basically, what I seem to remember is that having vsync turned on in games set to, for example, 1024 x 768 @ 60Hz, on a 10GHz Pentium 4 and a GeForce4 Ti20000, means your frame rate won't exceed 60 frames per second. I'm quite sure about that part.
Now let's suppose you have an Athlon 900 and a GeForce2 MX, running at the same resolution and refresh rate, in the same game. Let's say that theoretically, this setup would be able to draw 56fps.
I remember reading that if your computer is not able to provide the 60fps that's necessary, you would not get the 56fps that your system is capable of, but instead a frame rate which corresponds to the next lowest division of 60Hz which is equal to or below the fps that your system is able to deliver. So for example if you are running around in an empty level, and your system can happily render 70fps, your fps will be 60. Then more people join, and your system can only render 50fps, and now your fps is 45 (just a guess here). Then a big firefight breaks out, and your system can only render 42fps, and now your fps is 25 (again just an example).
Does that ring a bell, or am I barking up the wrong tree? :)
No, that's not how it works.
Think of it this way. If your refresh rate is set to 60Hz, and vsync is on, then your computer will always display exactly 60 frames per second, but some of them will be identical to the previous frame if the computer hasn't managed to draw a new frame in time. It doesn't matter if a new frame's ready one millisecond after a frame starts being drawn; it's missed its chance.
So if the system can only draw 56 frames per second, that's what you'll get, but it'll be a bit uneven - the actual signal going to the monitor will be exactly 60 frames per second, regardless of whether those frames were the same as or different from the previous one, and so the amount of time a given image can be on screen is always some multiple of 1/60th of a second.
With vsync turned off, the output to the monitor is still 60 frames per second, but now every frame starts getting drawn on the screen as soon as it can, with no regard for whether the last frame's been finished or not. If another frame is ready when the top half of the previous frame has already been drawn, then that particular frame on the monitor will have the top half of frame #28372 and the bottom half of frame #28373. That's the "tearing" artifact.
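You can see the vsync-on arithmetic with a toy model. This little Python sketch is my own illustration, not anything from a driver: the monitor refreshes exactly 60 times a second, each refresh shows the newest fully rendered frame, and a refresh repeats the old image if nothing new is ready. It assumes the renderer never stalls waiting for a buffer swap, which is a simplification, and all the fps numbers are just the examples from the letter.

```python
# Toy model of vsync-on display behaviour. Each refresh at time k/refresh_hz
# shows the newest frame completed by then; if none is new, the previous
# image is repeated. Frame i is assumed to finish at time (i+1)/render_fps.

def distinct_frames_shown(render_fps, refresh_hz, seconds=1):
    """Count how many genuinely new images reach the screen."""
    shown = 0
    last_frame = -1
    for k in range(1, int(refresh_hz * seconds) + 1):
        t = k / refresh_hz                # time of this refresh
        newest = int(t * render_fps) - 1  # newest frame finished by time t
        if newest > last_frame:           # a fresh frame made this refresh
            shown += 1
            last_frame = newest
    return shown

print(distinct_frames_shown(56, 60))   # 56 - you get all 56, just unevenly spaced
print(distinct_frames_shown(42, 60))   # 42 - not the 30 the "divisor" theory predicts
print(distinct_frames_shown(100, 60))  # 60 - capped at the refresh rate
```

The point is that the count tracks the renderer's actual speed right up to the refresh-rate ceiling; it doesn't snap down to the next divisor of 60.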
I've read your webpage concerning the TEAC CD-C68E. I also love the drive, but I had to upgrade my operating system to Windows XP and I lost the capability to access five of my six CD slots. Do you know a way to use all the slots with Windows XP?
There's no elegant way to do it. I mention this issue in passing here. ATAPI CD changers just don't work properly in Win2000 or WinXP, and as far as I know, there's nothing you can do about it.
Just read your latest I/O letters column, and I thought I might chuck in my two bits' worth on the post-shutdown cooling issue. You've titled the letter "CPU turbo timer"; I've had turbo timers on cars before, and the comparison doesn't quite stick. A turbo timer on a car actually does the opposite of what Nic Beavis wants his device to do. A turbo timer keeps the engine idling for a while in order to keep the turbo housing from cooling down too quickly, avoiding a rapid temperature change which could lead to variable expansion rates cracking the housing. Keeping the fans running after shutting your PC down would shorten the cool-down period, leading to an increased risk of differential expansion damaging the CPU.
It's my understanding that the purpose of (real) turbo timers is to keep oil flowing through the turbocharger as and after it spins down; without them, the very hot turbo gets no fresh oil, and cooks the oil that's left in it. The turbo can be damaged as a result, either because it's still spinning with lousy oil on its bearings, or because it gets gummed up with burned oil.
The fact that the continuing flow of exhaust gas during the timer period also warms the turbo is, I think, just a side effect; if anything, I'd think the continuing flow of oil will cool the turbo faster than would happen if you just stopped the engine.
Regarding differential expansion and its possible risks, I've received some other correspondence about this issue, but I'm unconvinced that there's a real risk here. Cooling the CPU faster won't do it any good, but I don't think there's evidence to suggest that it does any significant harm either, as long as you're not dumping liquid nitrogen on it or something.
I'm trying to format my 80Gb hard drive on my new P4 2.4GHz so I can dual boot to Windows XP and to Linux. I've heard that formatting a disk often wipes all the data on it, which would be a Bad Thing, but newer partitioners don't. The default XP partitioner doesn't say whether or not it wipes data. The Microsoft site naturally only has information on using two Windows flavors on the same HD, but says nothing about using other OSes. Can I simply use the XP partitioner, or would this cause lossage to files?
Repartitioning a drive, normally, does wipe everything on any partition you've changed. You need special "partition management software" to change partition sizes without losing the data on the partitions; WinXP's Disk Management utility can't do it.
Partition management software can juggle data around so that, as long as you have enough free disk space for the new partition and data arrangement, you can indeed repartition without losing data. The big name in this field is Partition Magic. Linux users can, as usual, do much the same thing for free; Parted seems to be a good option, and offers bootable floppy versions that anyone can use, whether or not they're interested in Linux per se.
I just read your recent SATA article and noted that you constantly refer to megabytes per second as Mb/s. I always thought that Mb/s meant megabits per second and that MB/s meant megabytes per second. The big B and little b were to distinguish between bytes and bits. I also find it hard to believe that you are wrong about anything. Could you please let me know if I am crazy or if you are in error?
Actually, the balance of opinion among people who bother to have arguments about this sort of thing is that bits get the upper case B. Well, that's the balance of opinion among that contingent of those people who are more likely to have beards and remember toggling boot code in on something's front panel. General usage seems to have swung back the other way, these days, except that the most popular abbreviation for "megabyte" is still "Mb".
Why all the confusion?
Well, on the one hand, bits came first. The byte doesn't even have a definite size - it varies depending on computer architecture. So, the argument goes, bits got the abbreviation B, and the byte had to make do with b.
On the other hand, bytes are bigger.
The gripping hand is, though, that this whole area is so hopelessly jumbled that you have to define your terms every time you use them. People abbreviate "megabyte" as "Mb", "MB", "M", "M-byte", "Meg", and God knows what else. It's a complete schemozzle.
Personally, I abbreviate bytes as b and bits as B, but I'll still define my terms every time I do it. If I remember.
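Whichever capitalisation you favour, the factor-of-eight conversion is the part that actually matters, and spelling the units out in full sidesteps the argument entirely. Here's a trivial sketch; the SATA figures in it are the standard first-generation numbers (a 1.5 gigabit per second line rate, 8b/10b encoded so every 10 bits on the wire carry 8 data bits), not anything from my article.

```python
# Bits versus bytes: use unambiguous names and no capitalisation
# convention is needed at all.
BITS_PER_BYTE = 8

def megabits_to_megabytes(mbit_per_s):
    """Convert a rate in megabits/s to megabytes/s."""
    return mbit_per_s / BITS_PER_BYTE

def sata1_payload_mbyte_per_s(line_rate_gbit_per_s=1.5):
    """Usable first-generation SATA throughput in megabytes/s."""
    data_bit_per_s = line_rate_gbit_per_s * 1e9 * 8 / 10  # strip 8b/10b overhead
    return data_bit_per_s / BITS_PER_BYTE / 1e6

print(megabits_to_megabytes(56))    # 7.0
print(sata1_payload_mbyte_per_s())  # 150.0 - the "150MB/s" on the box
```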
Do you know what the limit is for SVGA cable length? (and PS/2 keyboard and mouse cables, for that matter).
I'd like to move my 747 soundalike PC waaaaay over there, but I imagine there's some sort of limit. Also, if I exceed said limit, would it just not work, or will everything explode, or something in between?
The length limit depends on the quality of the cable. You need good cable for a run of any significant length, and you need quality connectors for any extension at all, unless you're happy with ghosting any time you use a high-ish resolution.
Realistically, the limit for monitor extension cables without expensive repeaters and such is around ten feet. Get a good ten foot extension lead (Belkin brand, for instance), and you should be fine up to quite high resolutions and refresh rates.
If you overdo it, nothing should come to harm. Over-long monitor cables just give you a crappy image. Over-long PS/2 cables may give you intermittent control. Nothing ought to die, in either case.
What's the secret to getting faster frame rate from a video camera (at a cheap price)? I want to do crisp clear slow motion (I understand shutter speed is part of the deal), but I am just not getting enough frames per second to see what I want to see.
As far as I know, there's no cheap option. Moderately high frame rate video camera setups (like, 120 frames per second, not ultra-fast explosion-tracking stuff) seem to cost well over $US5000, because they have to use special capture hardware; consumer capture boards won't work with them. On the plus side, they're not totally exotic equipment; apart from scientific applications, I think they're used for some industrial process control/robotics purposes, which means there are probably quite a lot of them out there, and you might be able to find an older but adequate rig on eBay, or something.
If you need colour as well as high frame rate, though, I think you're likely to be back into multi-kilobuck territory.
I'm always curious when video card reviewers mention we're "not yet at Toy Story quality". Anyway, I always wondered if the best PC hardware (P4 3GHz, NV30 or whatever) could match a $50000 SGI Octane2 or Fuel workstation (or whatever they use at Pixar Studios anyway) in OpenGL games or Benchtest demos.
For games and game-type benchmarks, PCs do very well indeed, and are super-monster-ultra winners compared with pro graphics workstations (most of which can't actually run PC games, of course) if you take price into account.
The big deal about the workstation solutions is that they're able to use their hardware for something useful; they don't have the performance-for-accuracy tradeoffs that the PC cards have (have you ever seen a line of white pixels along an edge in a 3D game, where the polygons don't meet up? That's an error that's tolerated because it's faster to be a bit sloppy, but it's obviously unacceptable for pro work), and they're also able to take their hardware-rendered images and rapidly pipe them back to main memory, so the rendering hardware can be used to pump out frames for use somewhere else. Which is, of course, what you want when you're rendering a movie instead of a game.
PC cards can be used for graphics-hardware-to-main-memory purposes as well, but their drivers really suck in that area; there's no value to it for games, so it's been completely ignored, and in any benchmark that tries to do it, PC cards will lose by a mile.
How much would a 2 by 2 by 1/5 inch (approximately) neodymium magnet cost that could pick up paper clips from 6-8 inches away?
It'd cost quite a lot, since I don't think there are any surplus sources for NIB magnets that size (that's considerably bigger than anything that I've ever pulled out of a hard drive). See here for about the biggest cheap NIB I've found.
GREETINGS FROM PUERTO RICO
HOW ARE YOU? HOW YOU DOING? IT ALL FINE? EVERU THING FINE?
HOW DO YOU FEEL IN AUSTRALIA? NICE TO MEET YOU MI NAME IS"
HIRAM MERCADO I'M FROM" PUERTO RICO, I LIKE THE COMPUTERS
I'M VERY FANATIC OF COMPTERS, I LIKE THE TOWER BLACK,
I'M INTERESTING IN THE TOWER BLACK, HOW MUCH THIS MODEL
OF TOWER BLACK, THE NUMER OF THE MODEL IS PC31,
THE MODEL THAT WANT IS TOWER BLACK PC 31, HOW MUCH WIHT THE
TOTAL OF SHIPHING AND HADLING? THE TOTAL COMPLETE OF THE TOWER BLACK
PLEASE SEND ME ALL INFORMATION IN MY E-MAIL
ATT" HIRAM FROM" PUERTO RICO
This guy sent me three copies of this. He got my usual "sorry, I don't sell things" reply. I don't bother replying to the credit card scammers who regularly e-mail me to try to buy 100 CPUs, but this chap didn't sound like one of them.
You cannot tell me that this is just someone who doesn't speak English too well.
I don't know what he's on, but I think I want some.