Dan's Data letters #9
Publication date: 8-Nov-2002.
Last modified 03-Dec-2011.
I have built a large TV stand which incorporates two 12" subwoofers/amplifiers (I'm a bass-head every now and again). I use this unit to house my stereo and video gear, and my TV sits on top. I get some severe discolouration on the TV from the magnets. Do you know if there is a way I can shield the magnets so they won't discolour the TV?
I have found some shielding material in the US but can't find a place in Oz (I live in Canberra). Could you also tell me where to buy a degaussing wand? I have been looking everywhere for one with no luck, and my TV's auto-degaussing doesn't fix the problem. Any advice you might have would be great.
Retrofitting shielding to a speaker is possible, but not really practical. It's usually much easier to just move the speaker away from the CRT.
One way to do it is to put iron or steel containers - "tin cans" - over the magnets. That'll reduce the field outside the speaker box significantly, though probably not enough, if you've got big subs with big magnets. You can use multiple cans over the top of each other for more shielding. Making all of this work in a speaker box without it buzzing or falling apart is Not My Problem.
You can also magnetically shield speakers in the way that pretty much all cheap magnetically shielded speakers do it - just stick another magnet, preferably more or less identical to the one on the back of the speaker already, to the back of that magnet. This "bucking" or "cancellation" magnet should be reversed, so that it's repelling the existing one, which means you need to stick it on with strong glue to stop it from leaping off of its own accord. The external magnetic field of the resulting assembly should be considerably reduced. Add a one-layer steel cover over both magnets and the field outside the speaker box ought to be pretty trivial.
Retrofitting another magnet to an existing driver is very awkward to do, though, even if you can lay your hands on an appropriate magnet to do it with. And the dual-magnet assembly may screw up the driver's performance, because the field around the driver's voice coil will also change.
There are also specific magnetic shielding materials, but they won't be any use to you. They keep magnetic fields out, not in, so to stop your TV suffering purity problems, you'd have to wrap it in shielding material. You wouldn't have to completely wrap it (you could still see the screen), but this, nonetheless, isn't a workable solution.
Here's some more information on this subject.
Degaussing wands can be hard to find. Mine has an "Arlec" sticker on it, but they don't seem to sell them any more. Call your local TV repair person and ask where they got their wand from.
Note that degaussing will do you no good at all if the external fields are still there, which they will be if the TV is still sitting on the subwoofer.
And now, just because it's pretty, a picture of a monitor being degaussed.
I just found a Western Digital hard drive with a capacity of 200Gb. I thought IDE didn't support capacities higher than 120Gb? Is there an explanation for this?
The IDE capacity limit you're thinking of is actually 128Gb, if you're talking real, 1,073,741,824-byte gigabytes. Or 137Gb, if you're talking fake, 1,000,000,000-byte hard-drive-marketing gigabytes. It comes from the 28-bit address space that was all ATA had, until recently. 28-bit-addressing drives can have a maximum of two to the power of 28 (268,435,456) sectors, each 512 bytes in size. From which comes the abovementioned 137,438,953,472-byte maximum capacity.
The new, and excitingly named, "Big Drive" standard extends the address space to 48 bits. So the theoretical maximum capacity is now 1,048,576 times larger. Which ought to tide us over for a while.
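If you want to check that arithmetic for yourself, it only takes a few lines of (for example) Python:

```python
# Maximum ATA capacities, from the width of the sector address.
SECTOR_BYTES = 512

def max_capacity(address_bits):
    """Largest addressable capacity in bytes for a given LBA width."""
    return (2 ** address_bits) * SECTOR_BYTES

old = max_capacity(28)  # original 28-bit ATA addressing
new = max_capacity(48)  # "Big Drive" 48-bit addressing

print(old)          # 137438953472 bytes
print(old / 2**30)  # 128.0 real (1,073,741,824-byte) gigabytes
print(old / 10**9)  # ~137.4 marketing gigabytes
print(new // old)   # 1048576 times larger
```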
For bigger-than-128Gb drives to work, you need an ATA host adapter that understands them (not hard to find these days), a motherboard BIOS that won't freak out (current BIOS revisions shouldn't), and also operating system and driver support. Older OS flavours may have conniptions over very large drives.
I have a two-fold question, but both questions go to answer the following umbrella question: Should I wait for a proper RGB-per-pixel CCD camera, or not?
1: I like to use programs like HDRshop to composite multiple images from my Nikon Coolpix 700 into nice High-Dynamic-Range photographs. I find the linear compression curve of digital cameras ugly-looking, but with an HDR image you can put a gamma curve on which looks even nicer than film, and then you can tell all those people who think film looks "warmer" to stick it. Also, HDR photographs are important as light probes for 3D rendering.
Now, sometimes my HDR assembled images have weird colour fringes which weren't in any of the LDR images. I am concerned that the weird one-sample-per-channel-per-three-pixels operation of current cameras causes problems when the LDR images are added together to make the HDR image. Is this the problem, and will it be solved by waiting for a new CCD?
Do you know of any cameras which record floating-point pixels natively, averting this problem entirely?
2: When you cut away the red and green information from a normal digital photograph, the blue spectrum is a hell of a lot more grainy than the red and green. It's yucky. Is this because of the camera's CCD sampling scheme, or is it chroma noise due to the JPEG compression?
I think the fringes in your pictures may be caused by slightly off registration - the camera or subject moved between shots. Even with a good tripod, it's possible for a very small relative position change to shift the image sufficiently that the camera's imaging hardware will change its opinion about the exact colour of an edge. Since the colour of everything but edges tends, by definition, to be pretty even from one pixel to the next, you won't notice such effects elsewhere.
The lower the resolution, the less of a problem this should be, but the Coolpix 700's 1600 by 1200 resolution is quite enough for tiny changes to make a difference. I presume you're triggering your tripod-mounted camera remotely, via serial cable; if you aren't, then your finger on the shutter button will certainly move the camera by a different amount every time. You should also be using TIFF image mode; if you're using JPEG, the compression will change the image's chroma values slightly differently every time.
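The effect's easy to demonstrate with a toy example - plain Python, nothing to do with any real camera's processing - in which a hard red/white edge gets "photographed" twice, with the second frame shifted one pixel sideways, and the frames are then naively averaged:

```python
# Toy illustration: a hard red/white edge, shot twice with a one-pixel
# shift between exposures. Averaging the frames, as a naive HDR merge
# would, invents a colour at the edge that appears in neither source.
edge = [(255, 0, 0)] * 4 + [(255, 255, 255)] * 4  # red | white
shifted = edge[1:] + edge[-1:]                     # same scene, moved 1px

merged = [tuple((a + b) // 2 for a, b in zip(p, q))
          for p, q in zip(edge, shifted)]

print(merged[3])  # (255, 127, 127) - a pink that was never in the scene
```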
You can see how the X3 sensor reduces colour error in the various blurbs about the SD9 (this one is good), but the camera's really only a curiosity, if you ask me. Audacious retailers are advertising the SD9 as a "10.3 megapixel" camera, on the grounds that it's got the chroma resolution of a regular digicam with that many pixels. But the thing outputs a 2268 by 1512 image, which most people will agree sounds a lot more like 3.4 megapixels. The SD9 also costs $US1800 - putting it in the same league as the six megapixel Nikon D100 and Canon EOS-D60 - and it can only take Sigma SA-mount lenses. It's got some other quirks, too, not least of which is a complete inability to save anything but uncompressed images; this review talks about them.
As far as raw pixel-value recording goes, any camera with a "Raw" image mode should, when you use that mode, record the native CCD element values directly. One problem with this is that for it to help at all, your HDR image creation software has to be able to load the RAW-format images; it's no good if you just turn the images into TIFFs on your PC before feeding them to the HDR software. HDRshop can apparently load at least some RAW format images, but the Coolpix 700 can't make them, so you're out of luck there for the time being.
Even the RAW images don't contain enough colour data for a true pixel-by-pixel RGB image, of course; the actual physical sensor of every colour digital camera so far except for the SD9 has an array of differently filtered one-hue-only pixels. So even if your HDR software works with the RAW images natively and only renders an RGB image at the end of all its dynamic range expansion calculations, it still faces the problem of figuring out what the filtered colour matrix of pixels actually represents.
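If you're wondering what that figuring-out actually involves, here's a minimal sketch of the simplest ("bilinear") demosaicing rule, run on a made-up RGGB patch. The numbers are invented for illustration, not from any real sensor:

```python
# A Bayer (RGGB) sensor records one colour per photosite, so the other
# two channels at each site have to be interpolated from neighbours.
def bayer_colour(row, col):
    """Which filter sits over a given photosite in an RGGB mosaic."""
    if row % 2 == 0:
        return 'R' if col % 2 == 0 else 'G'
    return 'G' if col % 2 == 0 else 'B'

# Raw sensor values for a 4x4 patch - one number per site, not RGB triples.
raw = [[10, 200, 12, 210],
       [190, 50, 195, 55],
       [11, 205, 13, 215],
       [185, 52, 200, 58]]

# Estimate green at the red site (2, 2) as the mean of its four green
# neighbours - the crudest possible demosaicing rule.
neighbours = [raw[1][2], raw[3][2], raw[2][1], raw[2][3]]
green_estimate = sum(neighbours) / len(neighbours)
print(green_estimate)  # 203.75
```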
Why's the blue grainier? Because digital image sensors, both CCD and CMOS, are less sensitive to blue light than to red (and are all quite highly sensitive to near infra-red, though some digital cameras have internal "hot mirror" filters to stop near-IR light from making it to the sensor). To boost the sensitivity, the blue-channel signal is thus amplified more than the green, which in turn is amplified more than the red; both signal and noise in the blue channel get amplified as a result.
Better cameras have a lot less blue noise. There are various "prosumer" digitals these days with pleasingly quiet results in all channels, at least until you start boosting their sensitivity setting above what their sensor actually delivers.
NetBEUI is a non-routable protocol. This is one of the reasons it is no longer used. (You should have seen the Microsoft campus network in the mid-'90s [when I worked there] when they were just making the switch from NetBEUI to TCP/IP; any NetBEUI packet from anywhere in the company had to appear everywhere; network performance was worse than Dixie cups and wet string; branches of the corp net would regularly disappear, only to reappear later when another branch vanished.)
My guess is maybe some of his machines were upgraded from Win9x and have NetBEUI set as their default protocol, which isn't getting passed through the switches.
While the term "switch" is not clearly defined, the thing that comes in a box from your local computer store with "Switch" written on the outside is not a router. NetBEUI will work fine over it.
Large networks that use NetBEUI are, indeed, miserably slow. NetBEUI is very fast for a two-node network; it becomes impractically slow around the 10 to 15 node mark.
I just recently read your information about serial numbers and credit cards [in letters column #4], which I found really interesting. Do you know if there are programs on the net to verify that the 3-digit credit card verification number matches the credit card number? Or can only banks do that?
Also on the topic of credit cards, my boss at work insists that when people order over email and provide their credit card details, they must send two emails, the first one with the first half of their credit card number, and a second email with the second part of their credit card number. Do you really think this is any more secure than just sending it in the one email?! I would have thought that if a hacker can intercept one email he would have no problem getting both of them!
Opinions seem to differ about how card verification numbers are generated - they're apparently derived from the card data using secret keys held by the issuing bank, so from the outside they might as well be random - but there's definitely no simple way to check them. (If there were, it'd defeat the purpose of having them in the first place.) There's more information about them here.
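Incidentally, the card number itself, unlike the verification number, does carry a simple public checksum - the Luhn algorithm. Passing it only proves the number wasn't mistyped, not that any such card exists. Here's a sketch; 4111111111111111 is a well-known test number, not anyone's real card:

```python
# Luhn checksum: from the right, double every second digit (subtracting
# 9 if the result exceeds 9), sum everything, and check divisibility by 10.
def luhn_ok(number):
    digits = [int(d) for d in str(number)]
    digits.reverse()
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:  # every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_ok("4111111111111111"))  # True - the classic test number
print(luhn_ok("4111111111111112"))  # False - one digit wrong
```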
And, as you say, someone with access to your mail server (or to the server your customer's using to send the mail) could snaffle card numbers out of messages to you even if you insisted people send their 16-digit number in 16 separate messages. If someone doesn't have access to the server, then the only way they could intercept the message (as opposed to just sucking the info out of your database after you've received it) would be via some kind of intermediate traffic-sniffing attack, which is very difficult.
To my knowledge, no credit card number has ever been compromised by Internet traffic sniffing.
In I/O letters column #14 somebody asked you about the magnetic field of the PC speaker inside the computer case messing up hard drives.
On German watercooling fora, the newest buzz is that the magnetic field and the current induced by water pumps are harmful to computer circuitry and hard disks. Since I don't know enough about this to say whether this is true or just another "you absolutely need these copper RAM-coolers with a noisy 40mm fan" cr*p, I turn to you.
The most used water pumps in German water cooling rigs are by the company Eheim. A guy in a German water-cooling forum used the following setup:
Pump: Eheim 1048
Probe: four turns of copper wire, 2cm in diameter
Meter: Sennheiser UPM-550 level meter
Directly at the pump's casing (in mVeff):
On top: 4 mV
At the side: 7 mV
Back: 1 mV
Front: 0.6 mV
At a distance of 8cm:
On top: 0.7 mV
At the side: 0.2 mV
Back: 0.2 mV
Front: 0.2 mV
Do you think that this is indeed harmful to a computer's innards? Most watercooling freaks in Germany now shield their pumps.
The electromagnetic radiation from a pump might be harmful to data, if you've got a nasty sparky brush motor in your pump. I strongly doubt magnetically coupled pump motors (see this review for more information on them) pose any threat, though. Their external magnetic field is laughably small compared with what's needed to wipe a floppy disk, much less a hard drive, and the induced voltages you mention are a long, LONG way below the one-point-something volts that's the lowest signal voltage that I think you'll find anywhere in a modern PC.
Unless something in the PC's really, really good at picking up these emissions, and really, really sensitive to noise, I don't think you'll see any effect.