Dan's Data letters #152
Publication date: 11-Oct-2005.
Last modified 03-Dec-2011.
A friend of mine found this review of a little amplifier and dutifully passed it along. I was wondering if you'd heard of it and whether it was legit. The review sounds a little too excited (or horny), which made me suspicious...
Yes, there's apparently been a lot of buzz about the exceedingly cheap Sonic Impact T-Amp in the audiophile community.
On the one hand, anything that 6moons likes is likely to be overpriced ("Only $99 for a 6' power cord?") tweaky audiophile nonsense (there's really nothing they won't fall for; they're the ones who believe "experts" who say that odd numbers sound better).
The 6moons guys, like a lot of other exceedingly enthusiastic audiophiles, exist in a magical alchemical universe where subjective experience is supreme, and actually superior to repeatable science. Which raises some questions about why anybody with this attitude to the world would bother writing reviews of stuff, but no matter.
On the other hand, the T-Amp is cheap by anyone's standards, although you have to give up a lot of modern conveniences (like proper connectors) in return. There's a newer, more expensive version that doesn't have these problems.
I'd say that either one is a perfectly safe buy, given how cheap they are.
I'm tempted to get one and set up a modest little music system in the spare room. The T-Amp's low power output isn't really that big a deal, since the logarithmic nature of human hearing means most hi-fi listening is only done at a couple of watts per channel anyway, but you still shouldn't expect it to be able to drive inefficient speakers in a big room to acceptable volume. If you're wondering what an inefficient speaker looks like, the most common kind is anything with a sealed box. Ported speakers, all things being equal, are considerably louder for a given power input. Small speakers are also generally less efficient than large ones.
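The couple-of-watts claim is easy enough to put in numbers. Here's a rough sketch; the 87dB/W/m sensitivity figure is my own assumed typical value for a smallish sealed-box speaker, not anything off a spec sheet.

```python
import math

def spl(sensitivity_db, watts, distance_m=1.0):
    """Rough on-axis sound pressure level: the speaker's sensitivity
    (dB at 1 watt, 1 metre) plus the logarithmic power gain, minus
    6dB per doubling of distance (free-field approximation)."""
    return (sensitivity_db
            + 10 * math.log10(watts)
            - 20 * math.log10(distance_m))

# An assumed 87dB/W/m bookshelf speaker at two watts, one metre:
print(round(spl(87, 2), 1))   # -> 90.0 dB: plenty loud
# Ten times the power only buys you 10dB more:
print(round(spl(87, 20), 1))  # -> 100.0 dB
```

Ten decibels is perceived as roughly "twice as loud", which is why a few honest watts into reasonably efficient speakers goes a long way.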
Because I'm not a writer for 6moons, however, I doubt the T-Amp will impress me with its "rounded character", "emotive ability" or "rhythm and pace", or that I will conclude that "the space around and between the instruments" isn't quite as "black" as that of a single-ended valve amplifier (which is also apparently more "fragrant" than the T-Amp).
But hey, we're all only one hit of blotter acid away from 6moons' wavelength.
I just choked down a $300 electric bill last month here in Houston, TX. Temps in the high 30s and a wife that likes the A/C set below 20 do nothing to help.
Then I read this.
Is it possible to power most of a home's electric needs from solar power? Or is it just for low-power devices like lights and gadgets?
It sure is possible to go all-solar, but unless you get a big fat government subsidy to do so, the price (and ongoing costs - occasional equipment failures, periodic battery replacement) is still prohibitive for pretty much anybody in a Western nation with access to grid power.
Average power consumption in Western houses varies considerably with location - somewhere badly insulated in a cold or hot place that uses electricity for heating or cooling (and there's not much option, for cooling) will obviously suck down a lot more juice than a place that for design and location reasons doesn't need much heating or cooling, or which has gas heating and cooking, et cetera.
A pretty frugal American house is likely to need an average of, say, 20 kilowatt-hours per day (you can figure this out for yourself by reading your electricity meter, of course). 20kWh over 24 hours is 833 watts; that's the average power your alternative energy source will have to provide. If it's solar, though, you're only going to get its full rated power in noonday sun, and you're not going to get anything during the night, and charging your batteries won't be anything like 100% efficient, and the inverters that give you mains power from the batteries' low voltage DC won't be 100% efficient either.
So, most likely, if you don't install at least "4000 watts" worth of panels to serve your apparent 833 watt load, you're not going to get enough juice.
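The derating above can be sketched like this. The equivalent-sun-hours and efficiency figures are my own assumed round numbers for illustration, not measured values; fiddle them for your location.

```python
def panels_needed_watts(kwh_per_day, sun_hours=6.0,
                        battery_eff=0.85, inverter_eff=0.9):
    """Rated panel wattage needed to cover a daily load, given
    equivalent full-sun hours per day and charge/inverter losses."""
    daily_wh = kwh_per_day * 1000
    return daily_wh / (sun_hours * battery_eff * inverter_eff)

# 20kWh per day is an average draw of:
print(round(20000 / 24))                # -> 833 watts
# ...but after derating, the panels you actually need:
print(round(panels_needed_watts(20)))   # -> 4357 watts
```

Which lands in the same ballpark as the "at least 4000 watts" figure; a cloudier climate or tired batteries pushes it higher.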
If you get a bargain on used solar panels, 4000W may cost you only $US8000. Add batteries and inverter and you're unquestionably going to be over $US10,000; if you don't install it all yourself (which it's probably at least partially illegal for a non-electrician to do where you live), add a few grand more.
Now, if your electricity bills are $US300 a month year-round, fifteen grand is actually not that outrageously expensive. You're certainly not going to be paying anything like three hundred a month to replace tired batteries.
If the bills average out at rather less across the year, though, you could be waiting ten years before you break even.
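The break-even arithmetic, naively (ignoring battery replacement, interest and electricity price rises, so this is the optimistic case):

```python
def payback_months(system_cost, monthly_bill):
    """Months until the system cost equals the cumulative bills
    avoided. Deliberately ignores battery replacement and the time
    value of money, so the real figure will be worse."""
    return system_cost / monthly_bill

# $15,000 system versus a steady $300/month bill:
print(payback_months(15000, 300))            # -> 50.0 months
# Versus a more realistic $120/month year-round average:
print(round(payback_months(15000, 120) / 12, 1))  # -> 10.4 years
```

The $120 figure is just an illustrative assumption; plug in your own yearly average.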
Go on an environmentalist rampage through your house replacing every juice-sucking incandescent light bulb and "power vampire" linear plugpack and you can make the whole situation less scary. Obviously, you should also make sure the joint is properly insulated. But there's only so much that can be done to many houses.
Using alternative power - solar or wind, usually - to supplement mains power is more feasible, especially if it's possible for you to pump your unused output back into the grid and get a rebate for it. This sort of setup needs no battery bank and can be pretty much as small as you like, and if the whole thing fails you're not sitting in the dark.
Dropping off the grid altogether, though, is still only a sensible option if you've got no choice, or if someone'll pay you to do it so they don't have to build more power plants.
I am currently in the middle of a somewhat massive house renovation in the UK (where I moved to from NZ about 8 years ago). As you probably know it gets mighty dark here (especially in the North) over the winter months. My new kitchen is to feature recessed downlighters/spotlights as task lighting over the kitchen work surfaces.
At first glance this would not be a problem and halogen spots would do. However the house (as with most UK houses) is multi-storey, and due to the heat put out by halogen spots they need a 300mm(!) void around them, which precludes sound insulation in the floor. I could use spots on a ceiling rail but these get mighty dirty. Also, the efficiency of halogens is low.
For these reasons I am thinking of using LED spots which plug into GU10 fittings (these, for example) or even cold cathode spots (such as those stocked here). They're (very) expensive but long-lasting and low power, so might just earn back the extra initial expenditure.
What is your opinion on such multi-LED spots for domestic use? I know that you've reviewed plenty of other consumer LED devices. It seems that this must be an area LED manufacturers would like to get into, seeing as I only own two torches (maybe three?), but my house will require 10 or so spots at the very least.
Unfortunately, the efficiency of white LEDs isn't any better than halogen.
Well, that's not quite true any more. Current cheap Chinese white LEDs continue the tradition of only giving about as many lumens per watt as halogen lamps, but the new Nichia 5mm LEDs have pretty much doubled output and considerably better efficiency as well, and the big Luxeons have always been a bit better in lumens per watt than common white LEDs. I don't think the new Nichias have filtered through to the downlight-replacement makers yet, though, and even when they do they won't solve the problem.
Pages like this, which admit the thing draws only one watt and say it's got "20 watt" light output - presumably meaning it's as bright as a 20 watt halogen - are what us professionals refer to as a pack of lies.
(In my original reply, I pointed to a three watt downlight with otherwise similar specs; it's been replaced by this one. I'll be charitable, and assume that the power reduction for the same alleged light output is the result of more efficient LEDs. Maybe the new Nichias are being used for downlights now.)
There's a good chance this LED downlight has the same light intensity right under it as you'll get from a 20 watt halogen, but that's because it's got a much narrower beam. OK for accent lighting, not so useful for general illumination.
Equalise the beam widths and a 1W LED downlight, even with the latest LEDs in it and even taking into account the higher perceived brightness of blue-white LED light compared with yellow-white halogen, will still give you no more illumination than a 5 watt halogen. And if you want 'em, there are currently plenty on eBay pretty cheaply, from sellers whose own silly claims are forgivable, once you look at the price tag.
Fluorescent is much more efficient than white LED, but small cold cathode tubes generally aren't. I'm willing to believe that someone's made a CCFL downlight replacement with similar efficiency to a normal compact fluorescent lamp, though, which makes the LED Lightbulbs claims somewhat plausible. Only somewhat, though.
A very ordinary cheap halogen downlight will score about 14 lumens per watt (brighter ones manage 20, though they don't necessarily last as long). LED Lightbulbs claim their CCFL downlight is 40 to 60 lumens per watt.
OK, sure, let's say 60.
With a quoted warmed-up run power of four watts, though, that means it ought to have the brightness of a 17 watt halogen, not the "35 watt equivalent" they claim. They're pulling a factor of two out of their fundamental orifice.
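The factor-of-two check works like this, using the figures above:

```python
def halogen_equivalent_watts(lamp_watts, lamp_lm_per_w,
                             halogen_lm_per_w=14):
    """How many watts of ordinary halogen would produce the same
    total lumens as the lamp in question."""
    return lamp_watts * lamp_lm_per_w / halogen_lm_per_w

# Four watt CCFL at the claimed best-case 60 lumens per watt:
print(round(halogen_equivalent_watts(4, 60), 1))  # -> 17.1, not 35
```

Even granting them their most generous efficiency number, the "35 watt equivalent" claim doesn't survive the division.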
I don't know what's with the "Color temp: 2000-12000K" thing, either. A colour temperature of 2000 Kelvin is orange. 12000 Kelvin is sky blue. I'm betting that this lamp never outputs either hue.
I think these people are just copying and pasting stats from the manufacturers of these lamps, and the manufacturers are in turn playing fast and loose with the truth.
The advantage of LED downlights, of course, is that their standard form factor lets you plug different ones in when better versions arrive or you want trippy coloured lighting for a party. The disadvantage is that that same standard socket lets someone plug a 50 watt halogen in there and burn the house down. Just installing downlight sockets into which normal downlights must never be plugged may be a building code violation where you live.
Personally, I'd either bite the bullet and install downlight sockets to code, starting with halogens and then installing high output LED or fluorescent lamps in them when they're available - or just forget about flush-to-ceiling lights. Install ordinary light fittings that can accept compact fluorescents, and use those.
It's possible now, by the way, to buy very high powered compact fluoro lamps, if you need to light a workspace or large area. Seventy-something or eighty-something watt models are easy enough to find (without being huge weird expensive grow lamps for indoor hydroponics enthusiasts). The 85 watters are "425 watt equivalent", and from my previous investigations, I suspect these light specs will be pretty close to honest.
This beats my previous strategy, of cascaded double socket adapters.
(Hey! An opportunity for another Big Clive link!)
Three watt Luxeon LEDs would appear to be just making it into the MR16-replacement market, if products like this and this are anything to go by. They've got three-watt-ish power consumption, and are described as "5W" lights, which is probably a fair description of their halogen-equivalent brightness. The "3 to 4 times greater reliability" part is a bit weird, though; a really long lasting MR16 halogen will hang in there for only around 10,000 hours, and lifespans of only a few thousand are common. An LED lamp that really does last for the usual boilerplate-specification 100,000 hours (and many will last a lot longer, but will get dimmer and dimmer as the years go by) gives you a lot more than "3 to 4 times" the lifespan.
I recently sold my GeForce 6800 because I just got a place at uni and really don't want to be tempted to play loads of games. Secondly, I thought blowing £100 on a card was quite silly if you're not a gamer.
In the end, I bought a second hand 64MB GeForce Ti4200. In 2D I notice no difference between the 4200 and the 6800. After unjamming the 4200's fan, which incidentally had not been running for the previous owner for a number of years, it worked fine.
However, the tiny fan on the card sounded horrendous. As my machine was built to be quiet it had to go - more on that in a sec. I was quite impressed by the way the card managed to run for years without the fan functioning - with the tiny heatsink the card had, it must have been cooking.
Even so, with the fan running the back of card behind the core is too hot to touch for more than a few seconds.
In the end, I removed the auditory torture heatsink and replaced it with a large black Pentium II heatsink I found at my friend's house. I used standard silicone goop on the core and superglue around the edge of the processor and on other parts where the heatsink would touch the card. It stuck surprisingly firmly... and permanently. I would have used a better quality heatsink goop, because they do not dry out so quickly... but the stuff I had was capacitive and conductive. If it had gone wrong, I reasoned, it would be difficult to remove the heatsink and clean off the paste, so I chose the non-conductive type.
Being entirely passive, the card is now silent. The heatsink gets nice and warm in games, but not TOO warm.
Now on to the question ;) My heatsink does not make contact with the card in a number of places, though these places are quite warm (a voltage regulator I think, and a few transistors - these were not cooled before). I was wondering if it would be OK to squirt standard white silicone heatsink paste behind the heatsink in these places?
(Mainly just for the hell of it, because it will probably not do a lot. The Pentium heatsink has holes in it from the old rivets, giving me an easy access point for a syringe of goo).
One last thing - why is the back of the card just where the core is soldered on still baking hot? With the original heatsink it was hot and now with the new heatsink which does not get too hot, the back of the core is still hot! I would have thought it would have cooled down by now...
1: Many modern video cards are OK with little cooling as long as they don't have to run in 3D mode.
2: People underestimate how hot chips (and other computer components, for that matter) can get and still work fine. It seems wrong that a chip can be too hot to hold your finger on and still perfectly happy, but it's true. It doesn't hurt to cool them off - and it can provide distinct lifespan benefits for things like hard drives - but a lot of silicon hums along just fine well above 65 degrees C, which is the kind of black-car-sitting-in-the-sun temperature that doesn't play well with humans.
(The hottest saunas can actually exceed 100 degrees C, but they have near-zero humidity, you don't spend long in there, and the wooden benches you sit on don't transfer heat to flesh well, even if you don't put a towel between them and you. "Wet" saunas don't get much above 40 degrees C.)
Anything on the card that wasn't cooled by the original heat sink very probably doesn't need to be cooled at all, but if you want to goop the place up just for fun, feel free. Normal thermal compound (as opposed to some metal-bearing super-compounds) is nonconductive and can be slathered all over the place, as it often is by beginners who put far too much on their CPU.
Tech support humour sites often feature pictures of PCs that've been assembled by someone who bought a 50 gram tube of white thermal goop and thought you had to use all of it in one go, but those PCs are not generally in the hands of a support person because of the grease explosion; CPU cooler retainers have enough tension to squeeze most of the goop out and thus still manage a half-decent thermal joint. The clueless owner generally does something else that actually kills the computer.
If excess grease gets runny when it's hot, though, it may drip out and end up somewhere 'orrible. On your own head, or carpet, be it.
Why is the back of the card hot? Because it's part of the cooling solution. Many modern chips with 2D arrays of pins or Ball Grid Array contacts derive a significant amount of cooling from heat conduction through those contacts and into the circuit board. There's a lot of copper there.
Again, this can result in an apparently alarmingly toasty board, while actually still being perfectly within spec for the components.
But wait - there's more! Click here to go to page 2 of this letters column!