Atomic I/O letters column #84
Originally published 2008, in Atomic: Maximum Power Computing. Last modified 16-Jan-2015.

First it fills the 0 capacitors, and then it gets to the 1s.
I remember the global unease with the idea that your new Compact Disc was actually read from the inside out. It was microwave cooking all over again! (Microwaves and inside-out cooking was the best global marketing scam EVER!)
Our conventional ovens still made a great roast, and my LPs still happily trundled along from the outside in, with a comforting white-noise hiss (something else that went missing in the transition). And they had done so since before I was a child.
Now we approach the end of another storage and playback medium: the hard disk drive, with its sweet, return-to-form, outside-in read/write method. On a round disk, too!
Flash memory approaches as the new bulk storage format. But it's not new - only the size and reliability are.
So my question is, for nostalgia, is there a physical/topology-driven method to memory address allocation in Flash RAM, or ordinary RAM? God knows you can't spin the things. Well, not with any real effect anyway.
Paul
Memory addressing was easier to visualise in the olden days.
Answer:
The write pattern for Flash memory actually changes on the fly. "Wear levelling" code in the firmware of all modern Flash RAM devices makes sure that even if all you do with your 1GB of Flash memory is write a 1MB file to it, delete it, write it again and repeat - actually, even if all you do is open the device with a freakin' sector editor and write over and over to the first blocks on the "disk" - you'll actually end up distributing your writes evenly over the whole device.
This is to prevent repeated writes to one area from wearing out that area when the rest of the device is still perfectly fine, and it's why modern large Flash devices can be used for normal disk-drive sorts of tasks without wearing out.
You could do wear-levelling with hard drives too, except (a) it's not really necessary, since the parts of hard drives that wear out are not the magnetic particles on the platters, and (b) you'd end up with a very, very slow hard drive, as it did zillions of seek operations all over the place.
Wear-levelled flash RAM does lots of seeking too, but because it's a solid state device, that seeking takes nearly no time at all.
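If you'd like the idea in code rather than words, here's a toy Python sketch - entirely my own illustration, with made-up names, nothing like what real flash controller firmware looks like - in which every logical write gets quietly redirected to whichever physical block has been written the fewest times:

    # Toy wear-levelling sketch: logical writes are remapped to the
    # least-worn free physical block, so hammering "block 0" over and
    # over actually spreads the wear across the whole device.
    # Illustrative only; real flash controllers are far more elaborate.

    class ToyFlash:
        def __init__(self, physical_blocks):
            self.wear = [0] * physical_blocks   # write count per physical block
            self.mapping = {}                   # logical block -> physical block
            self.data = {}                      # physical block -> contents

        def write(self, logical_block, contents):
            # Release whichever physical block this logical block used last time.
            old = self.mapping.pop(logical_block, None)
            if old is not None:
                self.data.pop(old, None)
            # Pick the least-worn physical block that isn't currently in use.
            free = [p for p in range(len(self.wear)) if p not in self.data]
            target = min(free, key=lambda p: self.wear[p])
            self.wear[target] += 1
            self.mapping[logical_block] = target
            self.data[target] = contents

        def read(self, logical_block):
            return self.data[self.mapping[logical_block]]

    flash = ToyFlash(physical_blocks=8)
    for i in range(100):
        flash.write(0, f"version {i}")   # the sector-editor abuse case from above
    print(flash.wear)                    # [13, 13, 13, 13, 12, 12, 12, 12]

Run that, and the hundred writes to "the first block" end up spread almost perfectly evenly over all eight pretend physical blocks, which is the whole point.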
Ordinary RAM doesn't wear out like Flash, so it's addressed in a more straightforward way.
Basically, you can think of the chips on a memory module as being like a bunch of hard drives in a striped RAID array. The data for each write operation is split between all of the chips on the module. Rambus RDRAM (which hasn't exactly set the PC world alight...) works differently, and can concentrate the whole of an operation, and thus of the power and heat created by that operation, on just one chip on a module. That's why RDRAM modules needed heat spreaders.
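If you want to push the striped-RAID analogy all the way into code (this is purely conceptual - it's not how memory modules are actually wired, and the function names are mine), the difference is between splitting each chunk of data across all the chips and dumping the whole thing on one of them:

    # Conceptual sketch of the striping analogy: a 64-bit write spread
    # across eight chips (one byte each), versus the whole value landing
    # on a single chip, RDRAM-concentration-style.

    def striped_write(value_64bit, chips=8):
        # Split a 64-bit value into one byte per chip.
        return [(value_64bit >> (8 * i)) & 0xFF for i in range(chips)]

    def concentrated_write(value_64bit):
        # The whole value goes to one chip, which does all the work.
        return [value_64bit]

    word = 0x0123456789ABCDEF
    print([hex(b) for b in striped_write(word)])       # eight chips, one byte each
    print([hex(w) for w in concentrated_write(word)])  # one chip cops the lot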
Scrape it off with a chisel
I enjoyed your thermal grease comparison. In the process of installing a Cooler Master water cooling unit onto my CPU, I removed the CPU from the motherboard and got a little thermal grease on the underside of the CPU (the part that touches the motherboard). I quickly wiped off the thermal grease. I have not run the computer yet.
Do you think that I have ruined my computer by getting some of the thermal grease on the underside of the CPU?
Please answer me! I am desperate!
William
Answer:
Don't worry. It'll be OK.
Some thermal grease is metal-loaded, and conductive; it's a bad idea to have that on the bottom of a CPU, bridging contacts. Almost all thermal grease is non-conductive, though, precisely because it could otherwise cause havoc almost anywhere it happened to land.
If you've got a modern Land Grid Array (LGA) CPU, with little pads on the bottom of the CPU and little pins in the CPU socket, then it's possible that grease actually still on the bottom of the chip could interfere with the contacts. Really spreadin' the stuff like peanut butter onto the bottom of a standard pin-type CPU could cause similar problems. It's also possible that you static-zapped your CPU while you were wiping the stuff off.
But none of this is very likely at all. You'll probably be fine.
Naphtha (Zippo-type) lighter fluid is a good solvent for thermal goop, by the way.
In 1988, 700Mb took a month
I have a 256k-ish broadband connection, which is actually pretty great, where I live. But when I download files I get a scant 40 kilobytes per second, at best.
Is there a way for me to download files using all 256k? As in, not use most of it for nothing as I leave my computer on all night to download a 700Mb file?
Zeke
Answer:
First up, a "256k" connection is a 256 kiloBIT per second connection, not a 256 kiloBYTE
per second one.
256 kilobits is 256,000 bits, because transfer speeds are specified in powers of ten, not powers of two. A byte is eight bits, a kilobyte is 1024 bytes (which is a power of two, and yes, I am perfectly aware of the more-correct binary prefixes that aren't yet in popular use), and so a "256k" download speed means no more than 31.25 kilobytes per second.
Take transfer-protocol overhead into account on a 256-kilobit-per-second connection, and you can expect to download 25 kilobytes of user data per second. Maybe 26, with a following wind. At that speed, 700 megabytes will take eight hours to transfer.
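If you want to check those sums yourself, here they are spelled out in Python, used purely as a calculator (the 25-kilobytes-per-second figure is just the rule of thumb from above):

    link_bits_per_sec = 256_000          # a "256k" connection: decimal kilo
    raw_bytes_per_sec = link_bits_per_sec / 8
    print(raw_bytes_per_sec / 1024)      # 31.25 kilobytes per second, best case

    useful_kb_per_sec = 25               # after protocol overhead, roughly
    file_kb = 700 * 1024                 # a 700-megabyte file
    hours = file_kb / useful_kb_per_sec / 3600
    print(round(hours, 1))               # 8.0 - all night, near enough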
You also can't get a file faster than the source - or sources - of that file want to send it to you. For P2P downloads of files that don't have tons and tons of peers, it's normal to only get a few tens of kilobytes per second, if that. The same applies for downloads from random servers out on the Internet somewhere; there can be more than one bottleneck between you and them.
Download manager software can fetch a file using multiple parallel transfers from the one server, which can in theory make the transfer faster. This "segmented" downloading seldom actually helps much, these days.
Most ISPs have local servers that host operating-system patches, game trailers and demos, Linux disc images and other such freely-distributable large files. If you can download the thing you want from there, you'll get it as fast as from anywhere, and it probably won't count toward your (explicit or secret) bandwidth allowance, either.
If downloads from a local server like that are still slower than they ought to be given the alleged speed of your connection, the problem is likely to be at your end. If you have a DSL connection, it's quite common for bad phone wiring - usually in the house - to greatly restrict the connection speed.
640 by 350 ought to be enough for anybody
I am trying to understand why there are a lot of LCD displays with screen resolution 1366 by 768, but HDTV resolution is 1280 by 720. (Doesn't matter if it's 720p or 720i, I suppose).
But the next step of HDTV, full HD, is 1920 by 1080, and some new LCD TVs really have this resolution. I believe that's the way it should be.
Why are manufacturers of LCD TVs pushing displays with resolution slightly higher than needed for displaying 1280 x 720?
Does it mean that there will be 24 black lines above and under the picture, and 43 black columns left and right of the picture?
With plasma displays things are even more strange - 1024 by 768 resolution is pretty common, but the screen's aspect ratio is 16:9, not 4:3.
Does this mean the pixels are not square?
Why this whole mess?
Dario
Answer:
I don't know exactly why the 1366 by 768 "intermediate" resolution is so common in the LCD TV industry.
[UPDATE: I still don't really know, but thanks to a couple of readers I do now know more about this than is probably healthy. More at the end of this reply.]
1366 by 768 is a reasonably standard PC resolution; it's the widescreen version of 1024 by 768, and is sometimes known as "WXGA". There aren't many (or any...) large-screen WXGA PC monitors, though, so you wouldn't think the panel companies would have any particular need to make screens with those dimensions. Perhaps standardisation of screen driver hardware has something to do with it, or making it easier to drive LCD TVs from PCs, or something.
But no, there won't (necessarily) be any black bars if you view HDTV on such a screen. The image will be scaled to fill the screen, if its aspect ratio permits. 1366 by 768, 1280 by 720 and 1920 by 1080 all have the same 16:9 aspect ratio as standard widescreen TVs, so any image that fills one of those screens will fill any other. Modern scaling hardware generally does a good job of this, though you obviously lose definition when scaling down and don't gain any definition when scaling up. (Though some scaling hardware can do a pretty good job of guessing what a scaled-up image should look like, and sharpen the image accordingly to make it look at least a bit HD-ish.)
The resolution you can actually perceive on a screen varies depending on its size, its distance from you, and your eyesight. I talked about this a few years ago, when computers and HDTV were the only sources of high-definition video in the home, in this piece.
As I said then, if you take the gold-standard visible resolution for a screen three metres away to be no more than 30 dots per inch, and you've got a 1920 by 1080 screen, Pythagoras tells us its diagonal is 2203 pixels, which at 30dpi means the diagonal has to be at least 73.4 inches. Which is pretty frickin' huge.
If you've got a mere 40-inch 1366 by 768 screen at that same distance, you'll have 39 dots per inch, which is probably already above what you can see, even if you waste a bit of the resolution by upscaling a DVD onto it. There'll be little to no visible difference if you upgrade to a true HDTV screen of the same size; you'll either have to sit closer or make the screen quite a lot bigger for a difference to be noticeable.
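Here's that arithmetic in reusable form, for anyone who wants to plug in their own screen (the 30-dots-per-inch-at-three-metres figure is just the yardstick I used above, not anything official):

    import math

    def diagonal_pixels(width, height):
        # Pythagoras: length of the screen diagonal, in pixels.
        return math.hypot(width, height)

    # A 1920 by 1080 screen, at 30dpi, needs a diagonal of at least...
    print(diagonal_pixels(1920, 1080) / 30)   # about 73.4 inches

    # ...while a 40-inch 1366 by 768 screen works out to...
    print(diagonal_pixels(1366, 768) / 40)    # about 39.2 dots per inch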
And yes, the pixels of 16:9 1024 by 768 plasma screens are rectangular (and there are 1024-by-1024 models as well, which are even worse). The minimum feature size for plasma screens is quite large, which is why their resolution is always relatively low.
I don't know whether it's a feature-size issue that forces them to use rectangular pixels, but they often do. And there's nothing wrong with that, as long as they've got enough pixels in each direction to handle the highest-resolution video you want them to display.
Here's what a couple of readers helped me figure out about this odd phenomenon.
Modern computer graphics modes, as defined by the Video Electronics Standards Association (VESA), all have square pixels. In the olden days there were lots of oblong-pixel graphics modes, but they, and the computers that made them, are all now obsolete.
There are umpteen oddball computer-screen resolutions that VESA never actually defined themselves, but they're all descended from actual VESA standards, and they all have square pixels. Take the abovementioned "Wide XGA" (WXGA), for instance. There's more than one "WXGA" resolution; WXGA includes weird resolutions like 1280 by 800 and 1366 by 768. But all of these resolutions were created by widening IBM's 1024-by-768 XGA standard into a 16:9 or 16:10 display.
(Just to make things even more fun, PC video hardware often can't handle horizontal or vertical resolutions that aren't divisible by eight, for what I've seen very informatively described as "legacy reasons". So if you've got a 1366-by-768 screen, you may find that you have to run it at 1368 by 768, with one column of pixels falling off the edge of the screen on either side. Note that different flavours of MPEG compression, as for many kinds of digital video, have similar resolution restrictions.)
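Seeing the divisible-by-eight fudge as a sum, in case you're curious (a throwaway illustration of mine, not anything a graphics driver actually contains):

    def round_up_to_multiple_of_8(n):
        # Round n up to the next multiple of eight; 1366 becomes 1368.
        return -(-n // 8) * 8

    print(round_up_to_multiple_of_8(1366))   # 1368
    print(1366 % 8, 1368 % 8)                # 6 0 - only the second divides evenly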
Modern TV video, however, is based on the ATSC standards, which define several modes that don't have square pixels.
The ATSC 640 by 480 4:3-aspect-ratio mode, for instance, has the same resolution and square pixels as the old PC "VGA" mode. But right next to it there's a 704 by 480 4:3 or 16:9 mode, which gives you two different kinds of oblong pixels. And on it goes; there are several more 480-pixel-tall "Standard Definition" ATSC modes, with interlaced and progressive-scan variants, and none of those has square pixels.
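To put a number on "oblong": a mode's pixel aspect ratio is just its display aspect ratio divided by its pixel-count ratio, where 1 means square pixels. Here's that sum, done exactly (my own arithmetic, not a table copied out of the ATSC documents):

    from fractions import Fraction

    def pixel_aspect(width, height, display_aspect):
        # Display aspect ratio divided by the ratio of the pixel counts.
        return Fraction(*display_aspect) / Fraction(width, height)

    print(pixel_aspect(640, 480, (4, 3)))     # 1     - square, same as PC "VGA"
    print(pixel_aspect(704, 480, (4, 3)))     # 10/11 - slightly skinny pixels
    print(pixel_aspect(704, 480, (16, 9)))    # 40/33 - distinctly fat pixels
    print(pixel_aspect(1920, 1080, (16, 9)))  # 1     - square again, thank goodness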
This matters not at all for old-fashioned analogue displays, like CRT monitors, because they don't have a distinct pixel grid in the first place.
They just map image pixels onto their phosphor dots as best they can, by shining electron beams at the phosphors.
Modern flatscreen monitors, though, do have a hard pixel grid, and look fuzzy at best if you try to display an image on them whose resolution doesn't line up perfectly with their ranks of red, green and blue "subpixels". Old LCD monitors displayed resolutions lower than their native pixel count with a black border around them, or used super-ugly "nearest neighbour" interpolation to scale the image up to cover the whole screen. They didn't even try to display resolutions higher than their native pixel count.
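For the morbidly curious, here's nearest-neighbour scaling in miniature - a toy Python version, not what any monitor's scaler chip actually runs - where each output pixel simply copies whichever input pixel is nearest, which is exactly why the result looks so blocky:

    def nearest_neighbour_scale(image, new_w, new_h):
        # Each output pixel copies the nearest input pixel; no blending at all.
        old_h, old_w = len(image), len(image[0])
        return [
            [image[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
            for y in range(new_h)
        ]

    tiny = [[1, 2],
            [3, 4]]
    for row in nearest_neighbour_scale(tiny, new_w=5, new_h=4):
        print(row)   # the 2x2 "image" turns into chunky 5x4 blocks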
Today, when a manufacturer produces a flat-panel display, they have to give it processing hardware that understands either the VESA modes, or the ATSC modes, or both.
It seems that pretty much nobody makes consumer, or even professional, display hardware that actually does understand both kinds of video mode. It'd be quite technically difficult to actually do this in the first place - not least because there's no good way to map an oblong-pixels format onto a screen with a hard grid of square pixels, and vice versa. It'd make monitors and TVs more complex to make and to set up. The people who care about this - like home-theatre-PC owners, for instance - aren't, as far as the screen manufacturers reckon, a big enough market to make it worthwhile.
So every flat panel - and video projector, for that matter - is made to display ATSC modes, or VESA(-ish) modes. Hence, 1366 by 768 screens, which conform to one of the several "WXGA" modes but are nowhere near any ATSC mode.
The simpler of the two resolution-standard options for manufacturers is the VESA-ish one, because VESA-ish pixels are always square, and there's no need to worry about interlace either. This, I think, is as good an explanation as you're ever going to get for flat-screen TVs that don't even necessarily have a computer input connector, but which nonetheless have computer-resolution panels in them.
All of this ought to fade into the past along with VHS and compact cassettes when high-definition television becomes the norm. The ATSC 16:9 high-definition display modes are 1280 by 720 and 1920 by 1080, and they do, thank goodness, have square pixels.
Computer displays seem to be standardising on the slightly squarer 16:10 aspect ratio, giving slim black "letterbox" bars above and below a 16:9 image (and, of course, much bigger bars if you're playing a properly widescreen movie). But that ought to be the full extent of the incompatibility between computer panels and HDTV panels.