Diamond Viper II graphics card

Review date: 9 April, 2000.
Last modified 03-Dec-2011.

 

Want a brutally fast PC graphics card to play games on?

Buy something with NVIDIA's GeForce, or faster GeForce DDR, chipset in it.

Can't afford a GeForce (starting price $AU460 or so)?

Ah, now things get interesting.

There are lots of not-terribly-new graphics cards in the lower price brackets - a few 3dfx Voodoo 3s, earlier NVIDIA designs like the assorted TNT2 chipsets, Matrox's rather good G400, ATI's peculiar dual-chip Rage Fury MAXX, and so on.

But there's also the Diamond Viper II. And it's a special case.

Diamond Viper II

Here it is, in a nutshell. The Viper II uses S3's Savage 2000 chipset. It is, at the moment, the only card that does; S3 and Diamond merged, and nobody else has been given the chipset, while lots of companies make NVIDIA chipset cards.

You'll pay a bit less than $AU400 for an Original Equipment Manufacturer (OEM) Viper II.

OEM cards are the product of choice for people who hate wasteful packaging; you get a card, a manual, and a driver disc, in an anti-static bag. There's not even a box, and there's no fancy software bundle to drive up the price either.

For your money, you get a card that can beat a GeForce, for some games, and particularly at lower resolutions and with slower CPUs. The Viper II is, in this regard, a better upgrade choice for people with less than cutting edge PCs.

And the Viper II makes a great DVD playback card; it's got a pair of TV outputs built in as standard.

But if you're considering buying one, there are other things you should know. Diamond/S3 promise things they don't deliver, and in some ways the Viper II is a blast from the past. Which, in PC gaming, is not a good thing at all.

What you get

The Viper II is a plain-looking AGP card. It supports AGP 1X, 2X, and 4X compatible motherboards, but like some older cards it requires you to change three jumpers on the card to enable 4X mode. This avoids compatibility problems with motherboards that incorrectly report their AGP capabilities.

Because it's an AGP card, you'll need an AGP slot on your motherboard, and an AGP-aware operating system - the final version of Windows 95, any version of Windows 98, or Windows 2000, for instance.

The card's cooled with just a densely packed heat sink on the main chip, and no fan. If you want to wind up the card's clock speed (with, for instance, PowerStrip) then it might be an idea to attach a little fan to the heat sink. This might not be a bad idea for Australian users, anyway; the 0.18 micron manufacturing process used in the Savage 2000 chip keeps its temperature down somewhat, but it still runs hot.

The default core and memory clock speeds for the Viper II are 125 and 143MHz. Before the production chipset arrived, S3 quoted some most impressive numbers from a 200MHz version of the chip, but actually making these in quantity appeared to be beyond their capabilities. This isn't unusual, by the way; graphics card manufacturers often quote optimistic high-clocked statistics while they're developing their new chipsets, to discourage buyers from spending money on competitors' products.

What you don't get

S3 went further than this normal level of misdirection during the Savage 2000's development, though. They're still telling fibs right now.

According to the manufacturers, the Savage 2000 was going to take the fight to NVIDIA, with on-board transform and lighting (T&L) support just like the GeForce. Transform is the process of mapping 3D data into the 2D rendition you see on the screen; lighting is just what it sounds like, the process of figuring out how brightly everything is lit by one or more in-scene lights.
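To make those two terms concrete, here's a toy Python sketch of the per-vertex arithmetic involved. The numbers, the simple pinhole camera and the one-light Lambertian model are all made up for illustration; real hardware does this with matrices, and far more flexibly. The point is that this sum has to be done for every vertex in the scene, every frame.

    def transform(point, cam_dist=5.0, screen=(640, 480), fov_scale=300.0):
        """Perspective-project a 3D point into 2D screen coordinates."""
        x, y, z = point
        z_cam = z + cam_dist                  # distance in front of the camera
        sx = screen[0] / 2 + fov_scale * x / z_cam
        sy = screen[1] / 2 - fov_scale * y / z_cam
        return sx, sy

    def diffuse_light(normal, to_light):
        """Simple Lambertian lighting: brightness from one light source."""
        dot = sum(n * l for n, l in zip(normal, to_light))
        return max(0.0, dot)                  # faces turned away get nothing

    # One vertex, its surface normal, and a unit vector pointing at the light.
    vertex = (1.0, 1.0, 2.0)
    normal = (0.0, 0.0, -1.0)
    light = (0.0, 0.0, -1.0)

    print(transform(vertex))             # -> roughly (362.9, 197.1)
    print(diffuse_light(normal, light))  # -> 1.0 (fully lit)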

Moving the transform and lighting portions of the 3D rendering process onto the graphics card reduces the load on the computer's CPU, and substantially speeds up rendering. The GeForce was the first affordable chipset to support this feature.

And so, according to S3, does the Savage 2000, with their so-called "S3TL" technology.

But it doesn't, actually, do it yet.

This is a bit embarrassing for S3. The card's been out for something like four months now - which is quite a long time, in the PC graphics world - and for all of that time the box and the Web site have proclaimed, in big letters, that the thing has hardware T&L. But it doesn't.

T&L support is officially included in the current Viper II drivers, which you can find in this FTP directory.

A couple of previous versions of the drivers had it too, but you had to turn it on with a simple little Registry hack. Now there's a proper option for it.

The Viper II hardware T&L does make a difference in many tests. Unfortunately, even as it increases the speed of OpenGL games, it screws up the graphics. Glitches galore. And it doesn't work at all in Direct3D. It's clearly not ready for prime time yet.

S3's original line was that the T&L support, though not in the first version, would be added in a driver update in "Q1 2000". Well, that's come and gone now, and there's still no definite arrival time for the working T&L driver.

This isn't the first time driver hassles have handicapped an otherwise promising piece of hardware. Matrox alienated plenty of users of their G200 cards, for instance, by taking more time than the mind could comprehend to come up with a proper OpenGL driver for them.

Mind you, although hardware T&L isn't something that needs to be specially supported by games, many games don't benefit very much from it. This is because they're written to run OK on the large majority of graphics cards, which don't have hardware T&L. A game with tons of lights all over the place, which the GeForce can elegantly handle, will crawl on other cards. So hardware T&L is under-used at the moment.

S3 are, therefore, quite right when they say that hardware T&L is not yet a terribly important feature. But that doesn't let them off the hook. Other manufacturers with plain non-hardware T&L cards don't get to claim otherwise on the packaging - why should S3?

Sure, it's likely that sooner or later the working drivers will be out. But we've heard that before. Matrox G200 users were saying it about the OpenGL driver, for instance, until their expensive graphics card was a generation and a half out of date. When you can get a GeForce with mature, stable drivers right now, and pay not a whole lot more, the Viper II had better have some serious speed.

Fortunately, for some games at least, it does.

But first, you've got to get it running.

Setting up

On a test machine with an Abit VA6 motherboard (review coming up shortly!), the test Viper II failed utterly to run any OpenGL games or benchmarks. OpenGL just couldn't be used; any program that tried to use it crashed. This, however, was without the AGP VXD from the VIA "4-in-1" drivers installed. Without this special AGP driver, you can expect VIA Apollo Pro 133 and 133A motherboards (the VA6 is an Apollo Pro 133 board) to misbehave.

So I installed the AGP VXD, from the 4.20 4-in-1 package available here.

Now, the OpenGL non-functionality wasn't a problem, because Windows didn't start at all.

Neither the Normal nor the Turbo mode for the VXD worked; I could restart in Safe Mode and change things, but couldn't boot normally with the VXD installed.

Apparently, you're supposed to install the 4-in-1 drivers, then the Viper II ones. Well, I did that. It still didn't work.

I tried the older v4.17 4-in-1 drivers, as well. And Windows, once again, presented a modern artwork which I believe I shall title "Alone"; a mouse pointer, palely loitering in the middle of a big black blank screen.

Given that VIA Technologies have recently announced their intention to buy S3's graphics card division, this all struck me as somewhat ironic.

There is, presumably, some magic combination of older 4-in-1 drivers and older Viper II drivers that works. I did not have the time, or the patience, to find it.

Anyway, I switched the Viper II to another test machine, with a 650MHz Athlon on an ASUS K7M motherboard (it's the computer I talk about building here). According to S3, you need to make sure you've got the latest AGP miniport driver (v4.61, as I write this) from here, first, so I did that. And the Viper II worked. And there was much rejoicing.

Like other Diamond graphics cards, the Viper II's default install also gives you Diamond's InControl Tools, which is a fairly elegant central control system for all of your graphics card settings.

You don't have to install the InControl Tools stuff, or even use the stock Viper II setup program; you can just expand the setup files to the directory of your choice, instead, and select the driver manually, pointing Windows at that directory when it asks.

Performance

To dip for a moment into tech-speak, the Savage 2000 has a dual pixel, dual texture rendering pipeline, while the GeForce has a quad pixel, single texture setup. In English, this results in the Savage 2000 losing to even a base model Standard Data Rate (SDR) GeForce in raw fill rate; it can do only about 250 megapixels per second, while the GeForce can do 480.

Fill rate is what you need to just put pixels on a screen, when nothing fancy's happening in the texturing department; a graphics system is said to be "fill rate limited" when it's fast enough to calculate everything's location and how that maps into a screen image, say, 70 times a second, but the graphics card can only paint, say, 40 frames per second onto the monitor. Lower the resolution, so fewer pixels need to be painted per frame, and a fill rate limited system will get a higher frame rate.

The "dual texture" part, though, lets the Savage 2000 narrowly beat the GeForce in textured pixel ("texel") speed, in games that use multitexturing - textures on top of each other. It's only by a little bit - 480 megatexels per second for the GeForce, 500 for the Savage 2000, with both cards running at stock speed. But the difference is there. As long as the texturing speed's the limiting factor, the Savage 2000 wins, in raw numbers.

If all that matters is the ability to push pixels onto the screen, and there's not so much texturing going on, the GeForce can streak ahead by almost a factor of two.
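If you feel like checking those numbers, they fall straight out of the pipeline layout and the core clock speed. Here's the arithmetic as a quick Python sketch; the 120MHz GeForce SDR core clock is the stock figure implied by the 480 megapixel number above, and the overdraw factor in the frame rate estimate is just a plausible guess, not a measurement.

    def rates(pixel_pipes, textures_per_pipe, core_mhz):
        """Theoretical peak fill and texel rates for a graphics chip."""
        fill = pixel_pipes * core_mhz                       # megapixels/sec
        texel = pixel_pipes * textures_per_pipe * core_mhz  # megatexels/sec
        return fill, texel

    print(rates(2, 2, 125))  # Savage 2000: (250, 500)
    print(rates(4, 1, 120))  # GeForce SDR: (480, 480)

    # Fill-rate-limited frame rate ceiling at 1024 by 768, assuming each
    # pixel gets painted about 2.5 times per frame (overdraw; a guess):
    pixels_per_frame = 1024 * 768 * 2.5
    print(250e6 / pixels_per_frame)  # Savage 2000: ~127 frames per second
    print(480e6 / pixels_per_frame)  # GeForce SDR: ~244 frames per second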

All of this, of course, assumes equally efficient drivers. Which is not the case. And this is another place where the Viper II makes things complicated.

So many standards to choose from...

Modern 3D PC games all work with one or more of the three main 3D Application Programming Interfaces (APIs). An API is a way for programmers to connect their software to all sorts of hardware; if a video card's driver supports, say, the OpenGL API, then any OpenGL game, past or future, will work with that video card. And any OpenGL-supporting video card, past or future, will work with that game. Old ones may not work very quickly, mind you, but if you wind the resolution and detail down, they'll probably be playable enough.

OpenGL is, as its name suggests, an open standard; anybody can write OpenGL drivers for their hardware, for free. Microsoft's Direct3D is the other big PC 3D standard; it's Microsoft's property, but any hardware maker can write drivers for it. And then there's 3dfx's proprietary Glide standard.

These days, though, Glide is fading into the background, because 3dfx jealously guard it. It's speedy, and it's easy to develop for, but only 3dfx cards have drivers that support it.

The decline of Glide would appear to suggest that there's not a lot of room in the market for a new proprietary 3D API.

Needless to say, S3 decided to make one.

The S3 "MeTaL" API works only with their Savage-series cards. On the plus side, this means that it's well tuned to S3's hardware, and very fast; Glide has the same advantages on 3dfx hardware. On the minus side, game manufacturers don't seem to care much about this API.

MeTaL is used by the following current games:

1) Unreal Tournament

And that's it.

It's no big deal to use MeTaL if you've got a Savage 2000; the Unreal Tournament driver comes on the software disc, with more recent versions available from the support page here. Tell Unreal Tournament to use MeTaL and presto, frame rates improve.

Texture squishing

S3 Texture Compression (S3TC) comes along with MeTaL, but it can also be used with other APIs.

S3TC reduces the amount of memory space textures take up, and it's implemented in hardware on the Viper II so it doesn't reduce speed. In fact, it increases it. And by using S3TC, much more detailed textures can be used, if a game has them to offer. S3TC uses rather heavy compression, though, and makes lower resolution textures very ugly and chunky.
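To give you an idea of how heavy the compression is: S3TC's usual opaque-texture mode stores only four bits per pixel, so the arithmetic (with a texture size I've picked out of the air) looks like this:

    def texture_bytes(width, height, bits_per_pixel):
        """Memory occupied by one texture at a given storage depth."""
        return width * height * bits_per_pixel // 8

    # A hypothetical 256 by 256 texture, uncompressed 32 bit versus
    # S3TC's four-bits-per-pixel opaque mode:
    print(texture_bytes(256, 256, 32))  # 262144 bytes (256 kilobytes)
    print(texture_bytes(256, 256, 4))   # 32768 bytes (32 kilobytes)
    # The same video memory holds eight times the texture detail.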

Quake III: Arena (Q3A) enables S3TC by default if you use a card that can handle it, and the game suffers a slight image quality loss as a result, even when texture detail's set to maximum. Pick the lower detail settings and a lot of textures, particularly sky textures, look frankly awful. In a fast moving game, though, it's surprising how little this matters.

You can turn off S3TC in Q3A by editing the q3config.cfg file, in the baseq3 subdirectory of the game directory, and changing the r_ext_compressed_textures line's value from "1" to "0".
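The relevant line, after the edit, looks like this:

    seta r_ext_compressed_textures "0"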

Turning off S3TC hurts frame rate somewhat; it depends on the resolution and the processor, but you can expect to lose something like 20%.

The GeForce, by the way, supports S3TC and four other kinds of texture compression, but only in Direct3D mode. It doesn't in OpenGL, because NVIDIA hasn't bothered to license the technology from S3. S3TC is part of the free DirectX standard, but it's not free if you want to support it in OpenGL. This is why Q3A, an OpenGL game, has S3TC support only when you're using an S3 graphics card.

Making numbers

The creators of the Savage 2000 drivers are, it seems, a bit crafty. They know that the flagship 3D action game of the moment is Quake III Arena, and they know that reviewers tend to do graphics card benchmarks using Q3A's timedemo mode.

So they optimised the drivers to suit Q3A, in particular. With the result that the Quake 3 frame rate numbers for the Viper II are smokin' fast. If all you ever play is Q3A, you'll be well pleased with a Viper II.

The 9.10.30 driver version is the most recent officially released package for the Viper II, although you can, as I write this, get pre-release 9.10.34 drivers from the FTP directory here, if you're game. I used the release drivers for these tests.

Q3A, an OpenGL game, shows off that optimisation nicely. In Q3A's "Normal" graphics mode, 640 by 480 with modest graphics quality settings, the standard Q3A demo #1 zipped past at much the same speed on both cards, with a small advantage for the GeForce. The Viper II pulled away as I turned the quality up, though; in 800 by 600 "High Quality" it won by a head, and when I wound it up further to 1024 by 768, with maximum texture detail and high geometric detail, the SDR GeForce lost embarrassingly.

Quake 3 results
(frames per second, Demo001)

Quality                                  Viper II   GeForce SDR
Normal (640 by 480)                          72.5          75.1
High Quality (800 by 600)                    65.1          61.8
1024 by 768, high geometric detail,
maximum texture quality                      45.9          34.2

The Viper II is also an excellent card for Unreal Tournament. Its special MeTaL driver delivers better performance than an SDR GeForce.

But once you get away from the Q3A-optimised OpenGL, and the UT-only MeTaL, the Viper II looks less exciting. 3DMark 2000, the rather pretty Direct3D game performance benchmark from MadOnion.com, does the Viper II no favours. On the 650MHz Athlon box, the Viper II scored 2923 3DMarks, while the GeForce clocked in at 3574.

This is a much better result than early Viper II drivers delivered; Viper II Direct3D performance seems quite acceptable now, and there are none of the bizarre visual glitches that made previous benchmarking efforts pointless.

And, like any card without hardware T&L, the Viper II sucks - in a manner previously only seen in holes in the side of spacecraft - when you ask it to run NVIDIA's TreeMark demo, which has little glowing butterflies flitting around a many-leafed tree.

TreeMark

In the Simple TreeMark benchmark, the GeForce scored 34 frames per second, with the Viper II managing 10.5; in the Complex benchmark, the GeForce got 10.7, versus a mere 2.3 for the Viper II.

Video quality

The Viper II does a sterling job of DVD playback, but that's no big deal unless you've got a slow CPU. Software DVD players working through just about any current graphics card will give results just as good as the newest and grooviest of high speed cards, as long as you've got enough CPU grunt. And anything faster than a 300MHz P-II or Celeron ought to be enough.

That old 300MHz P-II will be pretty much incapable of doing anything else while it's playing a movie, mind you, but that's probably OK - how much else do you want to do?

The extra hardware DVD acceleration in the Viper II is the same, in essence, as that in various other cards; it's motion compensation. Motion compensation isn't actually a quality-enhancing feature; it reduces CPU load substantially, but it doesn't look quite as good. Switching to all-software mode, and loading the CPU up some more, gives a better picture.

The dual video outputs on the Viper II, though, are a nice feature. Other cards have a single, dual function output that looks like an S-Video connector; you plug in a little adaptor cable that terminates in an RCA socket if you want composite output (see my video guide here if you're wondering what these terms mean). No such fiddling is necessary with the Viper II, since both kinds of connector are provided, and the card notices when something's connected to them on startup.

The included DVD player software is Zoran's OK-but-not-great SoftDVD.

Overall

The Viper II's smoking performance in Q3A and UT makes it an excellent choice for anybody who's interested in these games, particularly if they don't have a terribly speedy processor. At 640 by 480 or 800 by 600 resolution, it roughly equals the performance of a more expensive SDR GeForce, and it's actually faster at higher resolutions. But, remember, its Q3A and UT performance is a best case scenario, and the drivers aren't tuned for other games.

Most people, however, are not in the mood to clown around with different drivers and BIOS patches and registry tweaks and so on just to get the features listed on the box of their shiny new video card. This isn't to say that bleeding-edge PC gaming doesn't often make you fiddle with your setup, at least a bit. Current hot 3D cards are famous for not working on older, cheaper motherboards with low-power linear voltage regulators, for instance, and there are lots of games that misbehave on one card or another, until a patch to fix them comes out.

But video cards that still don't do what the manufacturer says they can do, four months after release, are to my mind only worth bothering with if they're a huge bargain. And the Viper II isn't, really.

Given its price, the Viper II can't afford to lose too badly to the GeForce. The OEM 32Mb version of the Viper II, which is the one I checked out, is retailing locally for $AU395. For only $AU475, you can get a Leadtek GeForce, also with TV out, and with a software bundle to boot - although the Leadtek software bundle's nothing to get excited about, as I explain in my GeForce comparison here. $AU535 will buy you a Gainward Cardexpert GeForce DDR board, which I review here; it's a perfectly good top-spec GeForce that benchmarks as well as $700 boards with bigger brand names. There's no TV out on the Gainward board, though.

If you've got a slower processor - which, these days, means just about anything running below 500MHz - then a DDR GeForce won't give you a huge advantage. It will, however, let you maintain the same not-too-exciting frame rates at higher resolutions.

For $AU30 or so less than the Viper II, you can get an OEM AGP or PCI Voodoo 3 3000, which is not nearly as fast and has only 16Mb of video memory, but has much more solid drivers, including Glide support. A Voodoo 3 2000 will cost you $AU100 or so less again.

OEM GeForces can be had in the USA now for less than $US170. This is only about $US40 more than similar deep-discounter prices for the Viper II, and it lines up roughly with Australian pricing, too. GeForce boards are just not that much more expensive than S3's offering.

If I didn't need a new video card right now, I'd hang on for a little while longer. Towards the end of this month (April 2000), NVIDIA will announce their next generation chipsets. These won't be available on an actual retail card for a while after that, but the announcement will drive prices for current cards down further still. 3dfx have their Voodoo 4 and 5 boards well on the way, too.

The Savage 2000 is an interesting chipset, and coulda been a contender, but if you ask me, the competition's just too stiff. If all you play is Q3A and UT, and you're tolerant of lackadaisical driver development, then a Viper II might suit you, if it works on your motherboard. But there are better options.

Pros:

  • Blazing Q3A and UT performance
  • Cheaper than GeForce

Cons:

  • Drivers still not quite done
  • VIA chipset incompatibility?

Main Viper II product page

S3's Viper II support page
(includes driver downloads)

Savage2000.com
(S3 chipset and other graphics news)

Evil Smokin' Dude's Viper II review
(they're pretty uncomplimentary about it, too :-)

Glossary

AGP: The Accelerated Graphics Port is based on the PCI standard, but clocked at least twice as fast to accommodate the demands of 3D graphics. AGP lets the graphics board rapidly access main memory for texture storage.
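For the curious, the peak transfer rates work out as follows, assuming the standard 66MHz, 32 bit wide AGP base specification (a quick Python sketch; real-world throughput is lower):

    def agp_bandwidth_mb(multiplier, base_mhz=66, bus_bits=32):
        """Peak AGP transfer rate, in megabytes per second."""
        return base_mhz * multiplier * bus_bits // 8

    for mode in (1, 2, 4):
        print(f"AGP {mode}X: about {agp_bandwidth_mb(mode)}MB/s")
    # AGP 1X: about 264MB/s; 2X: about 528MB/s; 4X: about 1056MB/s.
    # Plain 33MHz, 32 bit PCI manages roughly 132MB/s, for comparison.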

Colour depth: The number of distinct colours that a piece of hardware or software can display. It's referred to as depth, and sometimes as bit depth, because of the concept of overlapping, stacked "bitplanes", planar arrays of ones and zeroes that, together, define the colour of each pixel. The more bitplanes there are, the more bits per pixel, and the more bits per pixel, the more possible colours - number of colours equals two to the power of the number of bitplanes. 16 bits gives you 65536 possible colours, and 24 bit offers 16.8 million. Cards that do more than 24 bit use the extra bits for mixing channels and other funky stuff - 24 bit is more colours than the eye can discern already.
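As a quick sketch of that two-to-the-power-of arithmetic:

    for bits in (8, 16, 24):
        print(f"{bits} bit: {2 ** bits:,} colours")
    # 8 bit: 256 colours
    # 16 bit: 65,536 colours
    # 24 bit: 16,777,216 colours - the "16.8 million" figure above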

This is significant for gaming, because running your games in 24+ bit mode may be prettier, but will be slower. The image quality difference is not a large one; in Quake 2 you have to look hard to see the vague banding on walls in order to tell you're in 16 bit mode, and in a real game you don't have much time for that. Games with funkier engines that do fog mixing and similar tricks benefit more visually from 24 or 32 bit, but since going for 16 bit will let you run a higher resolution at the same speed, most gamers opt for fewer colours.

OpenGL games inherit the colour depth of the desktop when you run them; if you're running 16 bit in Windows, that's what the game'll be. Remember this if you run your favourite game and it seems strangely slow; check your desktop colour depth. Direct3D games choose their own colour depth, and may or may not be switchable between 16 and 32 bit mode. Some, like Incoming, come in different versions for different colour depths.

Direct3D: Microsoft's own 3D graphics Application Programming Interface (API), which serves the same function as OpenGL and Glide - programmers can use the API to get their software to work on any hardware with Direct3D support, instead of having to write their own drivers for every 3D board out there.

Glide: 3dfx's native 3D graphics standard, as used by the Voodoo cards of all flavours. When a game has rendering options that say something like "Standard OpenGL" and "3dfx OpenGL", the second option's Glide.

OpenGL: The platform-independent 3D graphics interface standard, with different flavours developed by Silicon Graphics and Microsoft. Does much the same thing as Direct3D and Glide, but does it on any computer you care to name.

Refresh rate: It's not enough that a given graphics system support the resolution and colour depth you want. It must also do it at a reasonable refresh rate. Refresh rate, measured in Hertz (Hz), is the number of times per second the screen is "repainted" with the image. Refresh rates below about 72Hz cause visible flicker; higher rates don't. Different people have different thresholds of annoyance when it comes to screen flicker, but staring at a 60Hz screen all day is an unpleasant experience for pretty much anyone. In gaming, refresh rate is not so critical, because you're generally not staring intently at relatively stationary objects in great fields of solid colour. But you still want 75Hz or so, if you can get it.


