ASUS AGP-V3800 Ultra TNT2 graphics card

Review date: 8 July 1999.
Last modified 03-Dec-2011.

 

TNT2 cards are all very well. They're very fast. They're pretty cheap. They're all most people want from a 2D/3D graphics card, these days. But if you're just not satisfied with TNT2, TNT2 Ultra is what you need.

The only difference between plain TNT2 and the Ultra variant is speed. Plain TNT2s have a stock clock speed of 125MHz for the core and 150MHz for the memory - some ship a bit faster than that, and many can run significantly faster again if overclocked with a program like Powerstrip (from Entech Taiwan, here) or the excellent little TNTClk utility. TNT2 Ultra cards, on the other hand, get the more capable main chips and faster RAM, and have a stock speed of 150MHz core and 183MHz memory. And, again, they can probably run rather faster.
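If you want to put numbers on that difference, here's a back-of-envelope Python sketch of theoretical fill rate and memory bandwidth at stock clocks. The two pixel pipelines and 128-bit memory bus are the TNT2's usual specs, assumed here rather than taken from this review, so treat the results as rough guides only.

    # Back-of-envelope TNT2 vs TNT2 Ultra comparison at stock clocks.
    PIPELINES = 2    # pixels written per core clock (assumed)
    BUS_BYTES = 16   # 128-bit memory bus moves 16 bytes per memory clock (assumed)

    cards = {
        "TNT2 (stock)":       (125, 150),   # core MHz, memory MHz
        "TNT2 Ultra (stock)": (150, 183),
    }

    for name, (core, mem) in cards.items():
        fill = core * PIPELINES             # megapixels per second
        bandwidth = mem * BUS_BYTES / 1000  # gigabytes per second
        print(f"{name}: {fill} Mpixel/s fill rate, {bandwidth:.2f}GB/s memory bandwidth")

On paper, then, the Ultra's stock-speed advantage works out to roughly 20% on both counts.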

An Ultra card doesn't have to have all of the fancy top-flight extras - video out, video in, partridges, pear trees - but many of them do, simply by virtue of being their manufacturers' flagship cards.

V3800 Ultra card

The ASUS AGP-V3800 Ultra is one such card, with the Ultra chipset, on top of all of the other features of the V3800 Deluxe. That means 32Mb of RAM, composite or Y/C video in and out (see my digital video explanation page here to see what this means), and not just a connector for ASUS' VR100 3D glasses, but the glasses, too! Unfortunately, the glasses aren't actually very good (I review them here), but they don't add much to the price, and could be fun to experiment with.

What you get

Apart from the clock speed, the V3800 Ultra package is the same as the V3800 Deluxe package, which in turn differs little from the V3800TVR I review in the three-card comparison here. You get the 3D specs, of course, but that's about it for extras. Card, video adaptor cables for TV or VCR or camera connections, standard driver disk (with lousy drivers; download the more recent ones here), Turok 2: Seeds of Evil and Extreme-G 2 game discs. For the full rundown on the quality of the ASUS package, check out the three-card comparison; in essence, it seems to be a perfectly well made card, with a fan on its main chip (which is pretty much essential for Ultra cards), and the supporting stuff is OK, although the driver could be better - and is, with the latest, downloadable version. You can use the V3800 with nVidia's reference Detonator drivers, but its fancy video and 3D glasses features won't work.

Setting up

Installing an Ultra card is no harder than installing a plain TNT2, which is in turn no harder than installing any other AGP graphics card. You'll need, of course, an AGP slot and an AGP-compatible operating system, like Windows 98. Windows 95 Release B will do, with the right AGP/USB drivers installed for your motherboard, but 98 makes it easier.

Switching from one TNT2 card to another, especially from a plain one to an Ultra, can be tricky. Windows tries to be intelligent and fails (amazing, I know). It autodetects the new card, but it keeps enough of the old driver files that the new card won't work properly.

The dead giveaway that your old driver hasn't been tidied up properly is a Display Properties box that only lets you select colour depths up to 16 bit, not 32 bit, and requires a reset instead of applying the changes straight away. After you reset, Windows brilliantly informs you that there's something wrong with your video setup, and you're still in 16 colour 640 by 480. Changing the driver manually won't help you - unless you switch to the plain VGA driver first.


Forget whatever brilliant auto-detected driver Windows has installed, unless it by some miracle works. Go to Display Properties, click the Settings tab, the Advanced button, and then the Adapter tab and the Change button. Click Next, select the "Display a list..." radio button and click Next again, select the "Show all hardware" radio button and then select the very top option in both window-panes, which is the bog standard 16 colour 640 by 480 VGA driver. If Windows decides to play ball, it won't re-detect your card when you restart, and you'll be able to do the change-adapter routine again, this time pointing Windows to the drivers you actually want.

Switching to the VGA driver before you remove the first card, which is what various AGP graphics card manuals suggest, doesn't seem to help. You've got to swap cards, let Windows screw up the driver if it wants to, then switch to VGA, then install the proper driver.

If this doesn't work, nuking your Windows directory and reinstalling probably will. But if you're upgrading from some completely different graphics card, not another TNT board which Windows in its infinite wisdom can't tell from the new one, the problem shouldn't arise.

Performance

The reason to buy an Ultra TNT2 is speed. And the V3800 Ultra delivers.

The OEM Diamond Viper V770 I had in the test machine (ASUS P2-99 motherboard, reviewed here, Celeron 300A CPU overclocked to 450MHz) has a stock speed of 125 and 150MHz, for core and RAM respectively, and will overclock only to 140/165MHz, which made it the least exciting, speed-wise, of the three cards I reviewed for the comparison. At stock and overclocked speeds, respectively, it manages 45 and 50 frames per second in a timedemo of Quake 2's demo2.dm2, in 1024 by 768 resolution, 16 bit colour.

At its stock 150/183 speed, the V3800 Ultra didn't score any better than the overclocked V770, which is odd, because it's clocked about 10% faster. I ran the test over and over, though, and didn't see an improvement.

No matter. The V3800 is happy to overclock. With the simple little TNTClk program, I managed to wind the V3800's core speed up to 175MHz (180MHz hung the computer, and I couldn't be bothered seeing if some intermediate value was stable, since the difference is minuscule). I couldn't find a ceiling for the RAM speed; it was still running fine, with only the tiniest of visible glitches, at 243MHz. This is as far as TNTClk can wind it (incidentally, pulling the memory speed slider all the way to the right consistently hung TNTClk, for some reason), and 18MHz faster than the popular shareware Powerstrip can manage. Monstrous RAM overclocking isn't useful, though; it's the core speed that really matters.

At the 175/243 speed, demo2.dm2 ripped by at 60 frames per second, a respectable 20% faster than the overclocked V770. It still managed better than 52 frames per second in 1280 by 960 resolution, which is the highest my poor old 17 inch monitor can handle!

The Crusher demo, at 1280 by 960, scored 39.9fps. This was about the speed of the old TNT-1 in 640 by 480, on the same processor. The Ultra card is pushing four times as many pixels. The TNT-1, at 1280 by 960, doesn't manage much more than 20fps in Crusher.
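For the curious, here's the arithmetic behind those comparisons, as a quick Python sketch using nothing but the frame rates and resolutions quoted above:

    def speedup(new_fps, old_fps):
        # Percentage improvement of one frame rate over another.
        return (new_fps / old_fps - 1) * 100

    # demo2.dm2, 1024 by 768, 16 bit: overclocked Ultra vs the V770.
    print(f"Ultra vs overclocked V770: {speedup(60, 50):.0f}%")   # 20%
    print(f"Ultra vs stock V770:       {speedup(60, 45):.0f}%")   # 33%

    # Crusher: about the same frame rate as a TNT-1 at 640 by 480, but at 1280 by 960.
    print(f"Pixels per frame ratio: {(1280 * 960) / (640 * 480):.0f}x")  # 4x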

The Crusher demo is so named, of course, because it's a worst case scenario. Quake 2 play on a 450MHz P-II class processor will be perfectly acceptable in 1024 by 768 on a TNT-1. It'll be a little better on a TNT2, and a little better again on a TNT2 Ultra, but if all you want to play is Quake 2, then a cheap OEM 8Mb TNT card is the way to go. 30 frames per second on Crusher is generally accepted to be as much as all but the very funkiest players need.

TNT2s are better suited to newer games, like the upcoming Quake 3. The 32Mb RAM of top-end TNT2 boards isn't terribly useful even for these games, but it comes in handy for 3D rendering, allowing high-detail, high-speed previews of the kind that could only be done by very pricey workstations a few years ago.

My favourite quick and dirty synthetic benchmark is WinTune 98, and it showed a less impressive advantage for the Ultra board. The overclocked V770 got a Direct3D score of 179 megapixels per second and an OpenGL score of 147 megapixels per second; the overclocked Ultra scored 209 and 168 respectively - 17% and 14% faster. 2D graphics performance was almost identical, at around 104 megapixels per second for both cards. Which, by the way, is rather more than anyone needs, as are the ludicrous resolutions and refresh rates supported by all TNT2 cards. If your desk does not groan under the weight of a multi-kilobuck monitor, rest assured that as far as 2D goes, any old TNT2 can pump out more pixels than your screen can clearly display.

Is it worth it?

You'll presently pay only about $75 (Australian dollars) more for the V3800 Ultra with all the fruit than for the Deluxe version with all the same trimmings, but no Ultra chip. Since even the base model V3800 has a fan on its chip cooler, you'll probably be able to clock it to around 160/175MHz (the result I managed with the one I reviewed for the comparison). This means the Ultra card, if it performs no better than this one did, will beat it by only about 10%, but cost 17% more.

On the other hand, if you get a cheap and cheerful V770 or some other OEM card with no fan, you're likely to be looking at 25% better 3D performance from the Ultra, for more than twice the price - the 16Mb OEM V770 lists, at the time of writing, for only $235. If you don't want the V3800 Ultra's video in and out and 3D glasses, it's pretty clear that the extra money isn't worth it just for the speed.

So if you're considering a V3800 already, the Ultra won't cost you a whole lot more. But it probably won't give you a whole lot more performance, either. If you just want to play games, a cheap OEM card delivers much better value for money, and isn't really all that much slower, either.

As all-bells-and-whistles graphics cards, the V3800 Deluxe and the V3800 Ultra are both winners, even if the 3D glasses aren't good for much. But I suspect a lot of the people stampeding to buy an Ultra card just 'cause it's the fastest would do better by spending half as much on a slower plain TNT2, and putting the difference towards more RAM or a faster processor. Or more games.

 

Pros:

  • Darn fast
  • Full-featured

Cons:

  • Not cheap
  • More features than most people need
  • Dud 3D glasses

Glossary

AGP: The Accelerated Graphics Port is based on the PCI standard, but clocked at least twice as fast to accommodate the demands of 3D graphics. AGP lets the graphics board rapidly access main memory for texture storage.
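To put numbers on "at least twice as fast", here's a quick sketch of peak theoretical bus bandwidth, assuming the usual 32-bit, 33MHz PCI bus and 32-bit, 66MHz AGP with one, two or four transfers per clock - standard bus figures, not something stated in the entry above.

    def bandwidth_mb_s(clock_mhz, bus_bits, transfers_per_clock=1):
        # Peak theoretical throughput in megabytes per second.
        return clock_mhz * (bus_bits / 8) * transfers_per_clock

    print(f"PCI:    {bandwidth_mb_s(33.3, 32):.0f}MB/s")      # ~133MB/s
    print(f"AGP 1x: {bandwidth_mb_s(66.6, 32):.0f}MB/s")      # ~266MB/s
    print(f"AGP 2x: {bandwidth_mb_s(66.6, 32, 2):.0f}MB/s")   # ~533MB/s
    print(f"AGP 4x: {bandwidth_mb_s(66.6, 32, 4):.0f}MB/s")   # ~1066MB/s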

Colour depth: The number of distinct colours that a piece of hardware or software can display. It's referred to as depth, and sometimes as bit depth, because of the concept of overlapping, stacked "bitplanes", planar arrays of ones and zeroes that, together, define the colour of each pixel. The more bitplanes there are, the more bits per pixel, and the more bits per pixel, the more possible colours - number of colours equals two to the power of the number of bitplanes. 16 bits gives you 65536 possible colours, and 24 bit offers 16.8 million. Cards that do more than 24 bit use the extra bits for mixing channels and other funky stuff - 24 bit is more colours than the eye can discern already.

This is significant for gaming, because running your games in 24+ bit mode may be prettier, but will be slower. The image quality difference is not a large one; in Quake 2 you have to look hard to see the vague banding on walls in order to tell you're in 16 bit mode, and in a real game you don't have much time for that. Games with funkier engines that do fog mixing and similar tricks benefit more visually from 24 or 32 bit, but since going for 16 bit will let you run a higher resolution at the same speed, most gamers opt for fewer colours.
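A couple of lines of Python make both points - the two-to-the-power-of-the-bitplanes arithmetic, and why 16 bit mode is lighter work for the card. The per-frame figures ignore the Z-buffer and double-buffering, so this is only a rough sketch.

    # Colours available at a given bit depth: two to the power of the bit count.
    for bits in (8, 16, 24):
        print(f"{bits} bit: {2 ** bits:,} colours")

    # Colour data written per frame at 1024 by 768 - roughly half as much
    # in 16 bit mode as in 32 bit mode.
    width, height = 1024, 768
    for bits in (16, 32):
        print(f"{bits} bit: {width * height * (bits // 8) / 2 ** 20:.1f}Mb per frame")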

OpenGL games inherit the colour depth of the desktop when you run them; if you're running 16 bit in Windows, that's what the game'll be. Remember this if you run your favourite game and it seems strangely slow; check your desktop colour depth. Direct3D games choose their own colour depth, and may or may not be switchable between 16 and 32 bit mode. Some, like Incoming, come in different versions for different colour depths.

Direct3D: Microsoft's own 3D graphics Application Programming Interface (API), which serves the same function as OpenGL and Glide - programmers can use the API to get their software to work on any hardware with Direct3D support, instead of having to write their own drivers for every 3D board out there.

Glide: 3DFX's native 3D graphics standard, as used by the Voodoo cards of all flavours. When a game has rendering options that say something like "Standard OpenGL" and "3DFX OpenGL", the second option's Glide.

OpenGL: The platform-independent 3D graphics interface standard, with different flavours developed by Silicon Graphics and Microsoft. Does much the same thing as Direct3D and Glide, but does it on any computer you care to name.

Refresh rate: It's not enough that a given graphics system support the resolution and colour depth you want. It must also do it at a reasonable refresh rate. Refresh rate, measured in Hertz (Hz), is the number of times per second the screen is "repainted" with the image. Refresh rates below about 72Hz cause visible flicker; higher rates don't. Different people have different thresholds of annoyance when it comes to screen flicker, but staring at a 60Hz screen all day is an unpleasant experience for pretty much anyone. In gaming, refresh rate is not so critical, because you're generally not staring intently at relatively stationary objects in great fields of solid colour. But you still want 75Hz or so, if you can get it.
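Resolution, refresh rate and colour depth all have to fit within what the card's RAMDAC and your monitor can manage. As a rough guide - the 1.3 times allowance for blanking overhead is an approximation, not a spec - here's the pixel clock a given mode needs:

    # Approximate pixel clock required for a given display mode.
    BLANKING_OVERHEAD = 1.3   # rough allowance for horizontal/vertical blanking (assumed)

    def pixel_clock_mhz(width, height, refresh_hz):
        return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

    for mode in ((1024, 768, 85), (1280, 960, 75), (1600, 1200, 75)):
        print(f"{mode[0]} by {mode[1]} at {mode[2]}Hz: ~{pixel_clock_mhz(*mode):.0f}MHz")

Even the biggest of those figures is no strain for any TNT2; as mentioned above, it's your monitor, not the card, that's the limiting factor.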


