Atomic I/O letters column #22
Originally published in Atomic: Maximum Power Computing
Reprinted here 21-Jun-2003.
Last modified 08-May-2016.
Yesterday, we got this IBM NetVista P-III running WinME in at the workshop. It had an error message: "No Operating System Found. Please Insert a Boot Disk and press F1.", or something like that. Anyway, the BIOS could detect the HDD A-OK, but FDISK said that there were no defined partitions. The HDD was on the way out (it was clucking every now and then - about every 5 to 35 minutes), but the computer was still detecting it so the data should be there.
To make a long story short, I ended up typing FDISK /MBR at the command prompt to rewrite the hard drive Master Boot Record, after which the PC could boot fine. I cleared about five viruses off the hard drive before copying its data to a new drive.
My question is this: did I make the right choice in using FDISK /MBR? I don't know much about it and it seems to be largely undocumented. Could I have destroyed all data and partitions? What is the Master Boot Record anyway?
Rewriting the MBR when it's been trashed by viruses or other mishaps is The Right Thing To Do, for Win95-series operating systems (95, 98 and ME). In any case, it's unlikely to do them any harm. It can, and probably will, do harm to NT-series Windows flavours, including Win2000 and WinXP; for them, you need to use FIXMBR from the Recovery Console.
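For reference, the two commands look like this (the first run from a Win9x boot floppy's prompt, the second from the NT/2000/XP Recovery Console):

```
A:\> FDISK /MBR     rewrites the MBR boot code; leaves the partition table alone
C:\> FIXMBR         the NT/2000/XP Recovery Console equivalent
```

Neither command touches the partition table itself, which is why a virus-scrambled boot sector can be repaired this way without losing data.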
Conveniently, Microsoft have done the rest of the question-answering job for me; check out this page.
In Atomic issue 17, in the "Speed for the masses" article, you mentioned that when two Seagate Barracuda IV drives were RAIDed "...they scored far lower than a single Barracuda IV drive". What's going on here!?
I've got one 80GB Barracuda IV, and was planning on buying another for a RAID 0 setup, but since reading Atomic's advice - "...if you plan to RAID, don't rely on the Barracuda IVs" - my hopes for increased speed were vaporised.
Has Seagate jibbed a few hundred dollars off unsuspecting customers by producing drives that cannot be used with RAID? Is there any way to resolve this problem? Has Seagate done anything? Do I have to live without the extra speed boost of RAID 0? Say it ain't so.
According to Seagate, the Barracuda IVs are just too darn fast for RAID.
Here's a helpful reference (including much Seagate we-are-cool-we-are-badasses-speak).
Are Ethernet crossover cables guaranteed to work in every situation?
Bear-Dog comes online one day saying that he has 2 network cards joined with a crossover cable, but they can't see each other on the network. In spite of network troubleshooting attempts, he is unable to get the machines going. I told him I am 100% certain that long ago, either Atomic or PC@uthority ran an article on small networks, and they touched on this very issue. The article basically stated something to the effect of not all cards supporting the use of a crossover cable. The reasoning (from memory) was something to do with the way the cards physically release the signals onto the network media.
The use of a hub or switch eliminated the problem, since these devices HAD to comply with some standard, but the cards did not. However I haven't been able to find any information about this particular issue.
This is one of those things that usually gets handwaved past the audience with a bit of mumbling about "timing issues". It's not curable; some NICs just don't like crossover cables, particularly at 100BaseT speed.
Pretty much everything on the retail shelves at the moment should be fine with a crossover cable, though, provided the cable isn't ludicrously long or badly made. This includes super-budget network cards. So if you've got problems with crossover cable compatibility, the simple solution is to just drop a dirt cheap new NIC into the machine.
I have been getting very frustrated with my home networking setup, as I am always moving cables and other networking paraphernalia around the house (not to mention to and from LANs) and it all gets very cluttered, tangled and just in everyone's way. After reading about different wireless setups I am considering going wireless. Is it possible to use some sort of amplifying equipment and an aerial to create a wireless network across a large area (eg a small town)?
Wireless isn't a bad idea, these days, since 802.11b gear can now be had for quite reasonable prices.
You certainly can set up a town-wide wireless network by using big fat signal amplifiers and matching antennas, but you'll get busted if you do; there are legal limits on the maximum power output of license-not-required equipment like this.
If you want long range wireless networking without broad coverage everywhere, though, you can do it without illegal amplification, by using directional antennas.
In Australia, a good place to find info about this is the Sydney Wireless site.
Atomic's comparison of heatsink/fans for the Pentium 4 and the Athlon was interesting enough, but for one major exception, and that is your baseline room temperature. Only in the depths of winter do I have anything in the order of your 20 degrees C; in summer I'm lucky if the indoor temperature is in the mid-30s, and sometimes it's higher (outside is a lot hotter than that).
I generally find that I must use considerable effort to keep my computer stable in summer. Not all of us have the luxury of air conditioned buildings, so how about including some extreme conditions in your testing to give a more real world appraisal of what is being tested?
You can apply the numbers from the big comparison in Atomic issue 23 to any ambient temperature easily enough. Just take into account the difference between the ambient temperature you're dealing with - 45°C, say, for the temperature inside a computer case on a hot day - and our 20°C test temperature, and add it to the result. So a cooler that scored, say, 59°C at 20°C ambient, would score 84°C at 45°C ambient.
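The adjustment is simple arithmetic; here it is as a sketch (the function name is mine):

```python
def projected_cpu_temp(measured_c, test_ambient_c, your_ambient_c):
    # A cooler's temperature rise above ambient stays roughly constant,
    # so just shift the test result by the ambient difference.
    return measured_c + (your_ambient_c - test_ambient_c)

# The worked example from the text: 59 degrees C at 20 degrees ambient
# becomes 84 degrees C at 45 degrees ambient.
print(projected_cpu_temp(59, 20, 45))  # 84
```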
There's not actually a lot of point to doing this, though, because the results from Atomic's "Chernobyl" test rig (and, indeed, the test rig I use in my own cooler comparison) aren't directly comparable with CPU results. Like all "CPU simulators", including my own, the Atomic rig just provides a basis for comparison, so you can see what cooler's better than what other cooler and by how much. The only thing that behaves exactly like a CPU is a CPU, and different CPUs have different heat outputs, even before you start overclocking. CPUs are also inherently uneven heat sources, in computers that do real world tasks; they'll be hotter when they're doing some jobs than when they're doing others.
If you need a cooler for use in hot conditions, you should simply buy the best-scoring one you can lay your hands on. Provided the cooler's installed properly and the case ventilation is good, you should be able to keep your computer stable up to around 40°C room temperature without any trouble.
If your PC gets flaky when it's hot, bear in mind that the CPU cooler isn't necessarily the culprit. Overheated motherboard chips or expansion cards may be at fault, for instance.
A simple but less than totally elegant solution for heat-wave computing is to just take off the side of the case whenever the weather's really hot, and point a desk fan into the computer's guts.
After reading the article about GeForce 4 Ti4200s in "Holy Grail" (as Atomic is known to me and some of my friends) Issue 23, I thought overclocking my graphics card might be a good idea. I have a 128MB Asus AGP-V8420 GeForce4 Ti4200, running on a Gigabyte GA-6VXE7 mobo with a P-III 800MHz and 640MB of SDRAM.
I downloaded PowerStrip, but then realised that I didn't know how far to safely push my GF4; being only 14 limits my cash inflow a little so I can't risk anything too much! That's why I decided to mail you 1337 people. What signs can I look out for that show that the GF4 is going too far? Any tips will be much appreciated!
You won't blow it up, at least unless you go hog-wild and start fiddling with hardware mods to boost the card's core voltage, and such. Otherwise, when you over-overclock a chip, it'll just stop working until it's wound down again.
When you overclock the GeForce4, increase your RAM and core speeds by small amounts - say, 5MHz at a time - and when you go too far, your computer will just hang. It'll probably be OK in 2D mode with the core speed wound up a bit too far, but will hang in 3D mode; over-overclocked RAM will probably give you noticeable twinkling-pixel image corruption before the computer hangs.
Once you establish the ceiling speed for the core or the RAM (fiddle with them separately), just reset the hanged PC, adjust the card speed to something a bit more conservative, and you're done.
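The stepping procedure above can be sketched as a loop (the `is_stable` callback here is hypothetical - in real life, it's you running a 3D test and watching for hangs or twinkling pixels):

```python
def find_safe_speed(start_mhz, is_stable, step=5, backoff=10):
    # Raise the clock in small steps until the next step would be unstable...
    speed = start_mhz
    while is_stable(speed + step):
        speed += step
    # ...then settle on something a bit more conservative than the ceiling.
    return speed - backoff
```

In practice, you run this procedure twice - once for the core clock and once for the RAM clock - fiddling with them separately, as above.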
My computer generates a lot of interference on AM720 (ABC Radio National, Mum's favourite station). How can I stop the interference (other than unplugging the computer)? Is it possible to shift the interference to another station? Or am I stuck with this stupid problem?
Clip-on ferrite beads are good for reducing RF interference from computer cables, but not below 1MHz.
You can't stop it, and you can't shift it, but you can minimise it.
There are lots of differently-clocked high-frequency circuits inside a PC. Most of them are motherboard buses of one kind or another, but expansion cards generally have their own clocks as well, and there are others; your keyboard, for instance, has an oscillator in it, and CRT monitors can produce plenty of RF. PC power supply units (PSUs) can squirt out a fair bit of RF, too.
Since PC oscillators generate a simple square wave (which is what you want, for a nice crisp digital signal), they also generate lots of RF noise. Square waves have strong harmonics, so you'll often find strong emissions at three and five times the frequency of a given oscillator (or at even higher odd harmonics), as well as at its base frequency. Cables connected to the PC can act as transmitting antennas.
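As an illustration of that harmonic spread (the bus frequency here is just an example figure):

```python
def odd_harmonics(base_mhz, count=4):
    # A square wave radiates at odd multiples of its base frequency -
    # 1x, 3x, 5x, 7x and so on - with diminishing strength.
    return [base_mhz * (2 * n + 1) for n in range(count)]

print(odd_harmonics(33.3))  # a 33.3MHz bus also emits near 100, 166 and 233MHz
```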
A metal PC case will provide pretty good RF emission shielding; a plastic monitor casing probably won't, but it's not a great idea to take it off and spray conductive lacquer all over it if you're not very sure of what you're doing, so never mind that for now. Instead, first, experiment with moving the PC and unplugging things from it. You may find that one particular cable accounts for the 720kHz noise; you may also find that just turning the PC around greatly ameliorates the problem. Also bear in mind the inverse-square law; the strength of the RFI falls off with the square of the PC's distance from the radio, so moving the computer and radio only 40% further apart will roughly halve the noise problem.
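The inverse-square arithmetic, as a quick sketch:

```python
def relative_rfi(distance_ratio):
    # Received interference falls off with the square of the distance,
    # so 1.4 times the separation gives roughly half the noise.
    return 1.0 / distance_ratio ** 2

print(relative_rfi(1.4))  # ~0.51 - about half the original interference
```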
Since the noise you're dealing with is relatively low frequency - 720kHz is way below the frequency of most of a modern PC's oscillators - clip-on ferrite beads are unlikely to help. I'll mention them anyway, because they are useful for quieting cables that are radiating above 1MHz, and they only cost around ten bucks each. Good electronics stores should have several types for you to choose from, including flat models made for ribbon cables. You just clip them onto the cable.
Cables with a bulge under the insulation at one end (or both) already have ferrites built in, but adding another one (or even several) can sometimes help.
Something that may help you rather more is plugging the PC into a good power filter. A cheap "surge/spike filter" power board won't change anything, but a decent uninterruptible power supply or line conditioner should remove any RF from the mains supply; low frequency interference has a good chance of being PSU-related. It's a good idea to filter your PC's power anyway, so that line faults can't fry it.
I am probably going to buy a Silverprop Cyclone 5 waterblock, Black Ice Extreme or Silverstorm BA radiator, Eheim 1250 pump and a couple of metres of Eheim 16/22mm tubing to water cool my 1GHz Duron (I will soon be getting an Athlon XP). I was just wondering if I would be able to use the coolant used in car engines with the above components, due to its higher heat conductivity. Or would it ruin them? Also, I'm not sure whether the waterblock will fit a Socket A Duron/Athlon.
Aussie outfit Silverprop's Cyclone 5 is as pretty as it is effective.
Car coolant doesn't actually have better heat transfer capabilities, generally speaking, than plain water. Some of the additives used in automotive coolant make it flow a bit more easily, which helps heat transfer, but most of the additives are there to prevent freezing, boiling and corrosion. Like pretty much anything else you can add to water, they reduce its heat transfer abilities.
That said, it's a good idea to put a shot or two of radiator additive in a PC water cooling system. Freezing and boiling aren't likely to be problems you need to address, but corrosion can still happen (especially in systems with water blocks made of one metal and radiator tubing made of another, as I discovered some time ago...), and the additive also ought to inhibit algae growth. Algae's not normally a problem in car coolant, because it gets so hot so often, but PC coolant can be a great growth medium.
The Cyclone 5 should fit any Socket A motherboard that's got mounting holes around the CPU socket.
Is it true that computers could overrun the world? I don't believe so, but I'm still curious.
Yes. It is true.
The only way to protect the world from computers is to send them all to me.
I will keep you safe from them.