Which video card?

Review date: 29 May 2003.
Last modified 03-Dec-2011.

 

Gee, it's easy to choose a video card, isn't it?

I mean, people who want a 2D/3D card for ordinary Windows work and games, and who've decided to go with an ATI or Nvidia chipset, only have to sort through about a dozen different current models from each of those two manufacturers.

And there are usually no more than two dozen versions of each model, from different card-makers.

Why, you could probably get one of every possible card for little more than $US100,000, and then just sort through them at your leisure! Easy!

Ain't market segmentation grand?

I used to think that people who paid top dollar for the newest and fanciest graphics card to hit the market - whose new features, invariably, won't be properly used by any game software until you can get something faster for half the price - all had more money than sense. Or were just poseurs. Or ran PC hardware sites and would simply die if anyone saw them using something with "MX" or "SE" on the end of its name.

But maybe some of these people just want a simple purchasing decision. "The heck with all these graphs and tables - give me the fastest thing you've got, and let me out of here!"

Herewith, my attempt to save you from that fate. I have endeavoured to sort out what's happening in the mainstream/enthusiast PC video card market, as of late May, 2003, determine what's value for money and what isn't, and present as un-terrifying a list of options as is possible.

I solemnly swear to subject you to no bar graphs at all. There'll be quite a few links to enormous reviews full of the cursed things, though.

If you only need a quick graph fix, there's always this one, which has served me well for years.

What you're paying for

It would be cynical and mean-spirited to suggest that graphics card companies need to justify the ever-higher memory speed and internal rendering power of their new gear, and are thus trying as hard as they can to persuade us that we all need ever-higher levels of Full-Scene Anti-Aliasing (FSAA) and, more recently, anisotropic filtering (AF). Which I could spend time describing, or just lazily link you to this excellent ExtremeTech piece on the subject.

FSAA and AF both burn off graphics card power very effectively. If you can live without either of them, then you can make do with surprisingly cheap video cards, and still get OK 3D frame rates.

Don't tell. It's a secret.

The contenders

First up: The video card race, right now, has ATI running in a narrow first place, over Nvidia, with a large expanse of daylight third. Then, there's a collection of Matrox and Trident and SiS and PowerVR products bringing up the rear, none of which need concern us now.

Secondly: Cheapskates will find that there's a floor price for video cards, below which you can't get anything decent, unless you head off to the dodgy dealers at computer markets and auction sites. It costs money to make a piece of multi-layer fibreglass with chips on it, and it costs money to put it in a box and get that box onto retail shelves.

As a result, budget video cards of somewhat recent manufacture simply can't be had (from reputable retailers) for less than about $AU100.

Sure, if you cruise the price comparison sites, you'll be able to find plenty of cards under $US30, but they're all decidedly elderly. Get up into the $US50/$AU100 price bracket and you find the real stuff, but also find that a GeForce2 MX will cost you around the same as a GeForce4 MX, despite being much slower.

For this reason, the GeForce4 MX is the slowest card I'm going to consider, here. As budget boards go, the various GF4MX models are still good. OK, they're more of a GeForce2.5 Ultra, when you look at the actual feature set. But the guy at the LAN party whose water cooling rig executes his computer, leaving him sitting in front of the GF4MX standby machine, is not going to be hideously disadvantaged as a result.

ATI's competition for the MX440 is the similarly venerable Radeon 9000. It's only a little faster for things the MX440 can do, but the 9000 has a full DirectX 8.1 feature set.

That didn't mean much when the 9000 first came out. It could complete all of the tests in 3DMark 2001, but the GF4MX couldn't. Big deal. Games that needed full DX8.1, then, were rare to nonexistent.

Now, though, DX8.1 is genuinely useful, and there's good reason for even casual gamers to get a card that supports it, unless they're only interested in old or new variations on pre-DX8.1 themes.

The 9000 and wee-bit-faster 9000 Pro (whose only real extra feature is that it can come with a DVI connector) have pretty much been eclipsed by the newer GeForce FX 5200. It's cheaper and faster.

(Yes, there are Radeon 9100s and 9200s as well, but they're not very interesting. The 9100's a re-launched 8500 which, with ATI's now-much-better drivers, offers a decent performance advantage over a GeForce4 MX to balance its small price premium, but it's a bit hard to find here in Australia. If you're curious, you can find the specs of every oddball Radeon that ever there was here.)

The most expensive video cards you can now get cost more than four times as much as these boards. You can buy a whole basic computer for the price of a current top-end 3D card. A video card needs to be pretty darn special to justify that kind of money.

So let's have a look at the rest of the ATI and Nvidia range.

The champ!

Radeon 9800 Pro

This is a Radeon 9800 Pro, which may or may not be the fastest PC 3D card in the world right now, depending on what weighting you put on minor features like actually being able to buy the card in a shop.

Nvidia's answer to the 9800 Pro is the GeForce FX 5900 Ultra. It's maybe a bit faster, but that's a bit of a moot point at the moment, because it doesn't yet exist in the retail market. Given two video cards of roughly equal capabilities, identical in all other respects except one of them doesn't exist, the one that does exist is the greater.

If the 5900 Ultra doesn't hit the market for about the same price as the 9800 Pro - it seems likely to be significantly more expensive - then it won't be especially exciting.

Ah, but isn't the GeForce FX able to do things the Radeon can't, I hear you ask? Like run those "Cinematic Computing" demos? Like Dawn?

(You know - that demo that takes great strides forward for the status of women in the IT industry, especially if you rename the "fairy.exe" file to "3dmark03.exe" or "quake3.exe".)

Well, the Radeon can run Dawn perfectly well (including both of the special renamed modes...), with a minor tweak to wrap the demo's custom Nvidia DirectX calls in OpenGL commands.

Wrappers hurt performance, but the 9800 Pro sure doesn't seem to give anything away to the 5800 Ultra when running Dawn. Whether the 9800 Pro actually runs Dawn faster than a 5800 Ultra is a matter for debate, since the Radeon doesn't quite render it the same way - Dawn's hair, for instance, is fluffily anti-aliased on the GeForce FX but jaggy on the Radeon. But there's no question that it's very much the same thing.

Devotees of Show-Off Computing don't like the 9800 Pro, though. Well, not the one I depict above, anyway. I mean, look at the thing. It's got a Molex power socket on the back to supplement the juice it sucks from the AGP slot, and it's red, but that's it for flashy features. It's got a quite modest cooler on its main chip, and it's got no heat sinks at all on the RAM.

9800 Pro back

Those bare RAM chips got some overclockers rather excited. Typically, video card performance is limited by RAM speed, and an overclock that yields a miserable 10% performance increase for real tasks is considered an excellent one.

Memory chip

The memory chips on the Radeon 9800 Pro are Samsung K4D26323RA-GC2As, which have a "2A" nanosecond rating. Two nanosecond RAM can run at 500MHz; 1000MHz, after DDR doubling. That's about 1.5 times the stock 9800 Pro RAM speed. Woohoo!

Regrettably, what "2A" means in Samsung-ese is "actually about 2.86". The maximum speed for this RAM is 350MHz, pre-DDR-doubling. So it's almost at max speed already.
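The nanosecond arithmetic is easy to check yourself. Here's a quick sketch (the helper names are mine, not anything Samsung publishes):

```python
# Convert a DRAM cycle-time rating (nanoseconds) to its maximum clock
# speed in MHz: one cycle every n nanoseconds is 1000/n million cycles
# per second.
def max_clock_mhz(cycle_time_ns):
    return 1000.0 / cycle_time_ns

def ddr_effective_mhz(cycle_time_ns):
    # DDR RAM transfers data on both clock edges, doubling the quoted rate.
    return 2 * max_clock_mhz(cycle_time_ns)

# A true two-nanosecond chip:
print(round(max_clock_mhz(2.0)))      # 500MHz...
print(round(ddr_effective_mhz(2.0)))  # ...1000MHz, after DDR doubling.

# What Samsung's "2A" grade actually works out to, at about 2.86ns:
print(round(max_clock_mhz(2.86)))     # 350MHz. Oh well.
```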

Oh well. On with the show!

Telling them apart

Pinning down the exact performance of a given graphics card, no matter what chipset it's based on, is very tricky. Not only do you need more informative graphs than the usual one-number bar chart, but there's a good case to be made against using synthetic benchmarks at all, primarily thanks to the ongoing issue of driver-based cheating by the video card companies. That's been going on for a while; Nvidia seem to have been doing it rather more shamelessly than ATI, lately.

And then, there's the un-level playing field. Any card with a given chipset and a given amount of RAM (or any amount of RAM, if you're not doing something that needs more memory than there is available) and given clock speeds for core and RAM will perform the same as any other. But card makers diddle around with the default core and RAM clock speeds. And, of course, so can users; overclocking features have been built into Nvidia's drivers for some time now, and PowerStrip works with everything out there, including the latest Radeons.

Generally, if a video card has "Pro" or "Ultra" on the end of its name, then you can expect it to just have slightly higher core and RAM clock speeds than the non-Pro/Ultra version. But oooh, no; ATI have to muddy the water by making a Radeon 9500 with four rendering pipelines and a 9500 Pro with eight. This is a much bigger change than a mere name-suffix usually indicates.

Every Radeon from the 9500 upwards has a full DirectX 9 feature set. This doesn't mean a whole lot yet; it's not as if running a mere DX8-capable card means you can't install DirectX 9. It just means that any attempt to use DX9 features will either be miserably slow (thanks to software rendering of stuff the hardware can't accelerate), or (more probably) just not work. Fortunately, not much software needs DX9 capability yet.

The Radeon 9500, 9500 Pro and 9700 all have the same default core and memory clock speeds, but the 9700 has a 256 bit memory interface that doubles its theoretical RAM bandwidth. The 9500 Pro and 9700 also both have eight rendering pipelines to the 9500's four, which gives them twice as much theoretical fill rate, and a noticeable performance advantage at higher resolutions, even without FSAA.
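The bandwidth doubling is simple to see on paper. Here's a sketch (the 540MHz effective RAM clock below is illustrative, not a quoted spec):

```python
# Theoretical memory bandwidth is bus width times effective clock rate;
# double the bus width and you double the bandwidth, for the same clock.
def bandwidth_gb_per_s(bus_width_bits, effective_clock_mhz):
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

narrow = bandwidth_gb_per_s(128, 540)  # 9500-style 128 bit bus: 8.64GB/s
wide = bandwidth_gb_per_s(256, 540)    # 9700-style 256 bit bus: 17.28GB/s
print(wide / narrow)  # 2.0
```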

The 9700 Pro is just a 9700 with a core and RAM clock speed boost - only 18 and 15 per cent, respectively. But it's only slightly more expensive, so it's a worthy alternative.

The 9800 Pro has a more seriously revised core, which is clocked faster again. But only by about 16% and 9.5% for core and memory, respectively, over the 9700 Pro.

Clock for clock, the 9800 Pro's a faster card than the 9700 Pro, but not by enough that you'd notice. The revised core and the higher clock speeds together make the 9800 Pro noticeably faster than the 9700 Pro, but not by nearly enough to justify its 30% price premium.

If you want a fast Radeon and can afford a 9800 Pro, get a 9700 Pro instead, and you'll probably have enough money left over to buy 512Mb of system memory, or an 80Gb hard drive.

The 256Mb version of the 9800 Pro is another animal again. If you want to run ludicrous texture detail in current 3D games with all pretty-sliders set to max, it's the card to get; why wait for the GeForce FX 5900 Ultra?

With a price tag well over $US500, though, you've really got to be more than slightly nutty to seriously consider it.

The newer 9600 Pro (reviews here and here) is far saner, but still not a great buy. It's an up-clocked 9500 with a 0.13 micron manufacturing process for the core, versus the 0.15 micron process for the older Radeons. The new process allows the 9600 Pro to run a core speed more than 45% faster than the 9500 Pro, but its RAM speed's only 11% faster than the 9500 Pro, and it's only got the four rendering pipelines of the regular 9500. So the 9500 Pro still outperforms it on most tests, and costs about the same.

Since I first put this page up, the 9600 Pro has turned up in Australia; you're looking at about $AU435, delivered.

Nvidia have stuck to the old clock-speed-only distinction between their Ultra and non-Ultra cards, on top of the strong similarities between all of the FXs. Things still get confusing, though, because of performance crossovers between the Ultra version of a slower card and the non-Ultra version of a faster one.

The budget MX versions of the older GeForce4 aren't really GeForce4s at all; they don't have a full DX8.1 feature set. But the GeForce FX 5200, which occupies the same spot in the FX lineup as the MX occupies in the GeForce4 range, is fully DX9 capable, just like the rest of the FX cards. FX 5200s are slow enough that DX9 eye candy may plod past at an unacceptably low frame rate, but the features are at least there.

The FX 5600 is a more realistic entry-level DX9 card. It differs from the 5800 only in the number of rendering pipelines (5800: 8, 5600: 4) and memory speed. It's fast enough to run your Half-Life 2s and your DOOM IIIs with all of the fancy-stuff turned on, at a respectable resolution, and with FSAA, even.

(We've been promised that a "700MHz processor" and "DirectX 6 capable video card" are the minimum requirements for Half-Life 2. A "large handkerchief with string tied to the corners" can similarly be described as the minimum requirement for skydiving. But if you've got yourself a 1GHz-plus Athlon or Duron, or any P4 or P4 Celeron, and anything from a GeForce4 MX on up, you won't be kicked out of the HL2 playground.)

The Radeon 9500 Pro fits the bill, too; considering that the ATI drivers don't yet play well with the DOOM III alpha test version, the results the cheaper ATI cards deliver are more than acceptable, and the price/performance ratio for 9500 Pros and FX 5600s looks likely to be too close to call for real DX9 tasks.

But, as I write this, the Half-Life 2 release date is still four months away. The latest ETA for DOOM III is mid-November.

That's a long time in graphics-card-land, people. The amount of DX9 compatibility that most gamers are likely to need before then will fit in a matchbox. With room for some matches.

So there's still room for the good old ("old", here, means "two years") GeForce4.

Full, non-MX GeForce4s are not slow graphics cards. If you've got one now, there's precious little reason to upgrade just yet, no matter how nuts for performance you are. A Ti4200 will stride past an FX 5200 in non-DX9 tests (as you'd expect it to, since it still costs more than half again as much), and it'll give an FX 5600 a decent run for its money, too. A humble Radeon 9500 Pro will whip any GeForce4 Ti you like once anti-aliasing and anisotropic filtering enter the equation, but the Ti4200's considerably cheaper.

So if you're shopping for a new video card, the GeForce4 is still a decent choice. The new wave of fancy games that actually make use of DX9 will still work fine on a Ti4200, and you'll get more frame rate per dollar than you'll get out of any GeForce FX - or any Radeon.

Right now, 128Mb Ti4200-8X (the AGP-8X capable second generation Ti4200, with slightly higher clock speeds than the original) cards go for around $AU250. That price isn't quite into budget-card territory, but it's very reasonable for what you get.

But lo, there's the Ti4800SE, which is just a Ti4400 with AGP 8X. And the "full" Ti4800, which is a Ti4600 with AGP 8X.

The Ti4400-speed GeForce4s run at 275MHz core, 550MHz RAM (including DDR doubling); the Ti4600-speed ones run at 300MHz core, 650MHz RAM. If this suggests to you that all of these cards are likely to perform about the same, you're right; the Ti4800SE is only about 10% faster than the Ti4200-8X, and the Ti4800 has a sub-10% core speed boost and an 18% RAM speed boost, which add up to a further 10% or so in the benchmarks, with a following wind.
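Those percentages are trivial to verify from the clock speeds quoted above:

```python
# Relative clock-speed differences between the Ti4400-speed and
# Ti4600-speed GeForce4 variants.
def pct_faster(new_mhz, old_mhz):
    return (new_mhz / old_mhz - 1) * 100

core_boost = pct_faster(300, 275)  # about 9.1% - the sub-10% core boost
ram_boost = pct_faster(650, 550)   # about 18.2% - the RAM speed boost
print(f"core: {core_boost:.1f}%, RAM: {ram_boost:.1f}%")
```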

Given that you're going to pay easily 30 to 40% more for a 4800SE than you will for a 4200-8X with similar features, the cheaper card's clearly the better deal.

I've got to talk about the Ludicrous GeForce FX Cards somewhere, so I'll do it here.

GeForceFX 5800 Ultra

This, as every graphics card fetishist knows (it's not all that weird a fetish, if compared with some others), is a GeForce FX 5800 Ultra.

If built to Nvidia's reference design, the FX 5800 Ultra not only takes up two slots, but also features the frankly outrageous "FlowFX" cooling system.

This blower-pumped copper monster pushes the 5800 Ultra's weight to a gargantuan 607 grams (1.34 pounds). Its ridiculousness is so undeniable that even Nvidia staff have publicly taken the piss out of it.

5800 Ultra back

Nvidia are so proud of the 5800 Ultra - which never made it to the retail market in quantity, and was narrowly beaten by the far less flashy top-end Radeons - that the FX 5800 page is no longer linked to from the main FX page. The PR department is busy trying to erase the 5800 Ultra from everyone's memory. There never was a 5800. Just a 5900.

GeForce FX 5800

This is the rather more sensible regular 5800. Plenty of these have been sold; their more modest clock speed was good for chip yields, so the plain 5800s have gotten onto the shelves just fine.

GeForce FX 5800 rear

The difference between the 5800 Ultra and the regular 5800 is bigger than the difference between the Radeon 9700 Pro and 9800 Pro. At high resolutions, it stretches out to about 20%. This means there's not a lot of reason to buy a 5800 now, but they do serve as a good guide to how the fifty-nine-hundred cards look.

GeForce FX 5800 cooler

Nvidia's ditched the FlowFX idea for the 5900s, but the double-wide design stays. This isn't a big problem. Many high-end video cards have chunky coolers on them that block the slot next door anyway; it's generally not a great idea to put something in that slot even if you can (that used to be a recipe for system resource conflicts, but nowadays it's just bad ventilation juju); and the double tab helps to solidly secure these video cards that're all several times heavier than anybody imagined when AGP slots were young.

Nvidia partisans are hanging out for the 5900 (regular or Ultra), and it'll actually be a non-crazy option for anybody who's already made the basically unbalanced decision to get a seriously high end 256Mb-memory graphics adapter. The 5900 compares pretty well with the 9800 Pro, and the 256Mb versions of the two of them should sell for around the same money. When, of course, you can buy 5900s.

Those of us who prefer a card closer to the middle of that steep and ugly price/performance curve, though, have now largely overcome our skepticism about ATI making drivers that work (they were never as bad as Matrox back in the G200 days, but then again, neither's gangrene), and are getting mid-range Radeons instead.

Refresh rate shenanigans

The following section is a public service announcement.

Current NT-series Windows flavours - Win2000 and WinXP - generally let you bludgeon the OS into running your monitor at a decent refresh rate in 2D mode. It's not as straightforward as it should be, as I discuss here, but at least it's possible, using the regular display setup functions.

If you want to set refresh rate in 3D mode, though, you're in for all the fun of the fair.

For a long time, you simply had to use outboard software to get 3D in Win2000/XP to run at something other than 60Hz. The best of the utilities that evolved to do this was, and is, RefreshForce.

If you're using the current Nvidia drivers, you ought not to need RefreshForce; those functions have finally been rolled into the driver. If you're using the current ATI drivers, though, you may find that not even RefreshForce solves the problem.

I'll cut to the chase. The program you want is RefreshFix. It may also help to go to Display Properties -> Settings -> Advanced -> Displays -> Monitor, disable the "Use DDC information" option, then manually enter the maximum resolution and refresh rate you want.

And now, the part you've all been waiting for.

What to buy

What video card am I using right now, in my faster-than-stock P4 box?

Why, this one.

Sapphire Radeon 9700 Pro Ultimate Edition

You are, at this juncture, allowed to say "Whoa".

This is a Sapphire Radeon 9700 Pro Ultimate Edition. With no fan. Instead, it's got, um, bodacious heat sinks on either side of it...

Ultimate Edition side view

...connected to each other by a heat pipe. The RAM has ordinary passive heat sinks on it; all of that aluminium is just for the main chip.

(Sapphire didn't invent this cooler, by the way; it's a Zalman ZM80.)

Fanless heat sinks may be quiet, but they're much less effective than fanned ones; even in plain old 2D mode, this card's heat sink gets up to around 30°C above ambient. So if your PC case isn't well ventilated, this isn't the graphics card for you.

The Ultimate Edition 9700 Pro weighs a scale-bending 558 grams (1.23 pounds). Not as much as a standard GeForce FX 5800, I grant you, but enough to make you nervous when you're installing it.

When I first reviewed it, this card cost a similarly hefty $AU716.10 delivered from Aus PC Market. But never mind the price; feel the gravitational field.

In case you care, the Ultimate Edition card also comes with a passable bundle. As well as the drivers, you get a 99.997% empty CD containing Sapphire's "Redline" tweak utility, which will allow you to explore the not-very-exciting overclocking potential of the 9700 Pro, and fiddle with hidden driver features as well. It looks silly, but it's decently functional; you can, for instance, set different tweak profiles to automatically be applied to different games. Redline's 1.84Mb install directory's still a bit of a waste of a whole CD, though.

More entertainingly, you get Return To Castle Wolfenstein and Soldier of Fortune II. Neither of which are what you'd call new, but they're certainly not your usual bargain-bin shovelware. RTCW's got groovy multiplayer (an enhanced variant of which is now available for free, mind you) and cybernetic demon Nazis. And even the reviewers that didn't like SOF2 said it featured "an unprecedented level of brutally graphic violence". If that ain't enough for you, then you're just not a gamer.

You also get the proper suite of video connector stuff - one Y/C cable, one composite video cable, one adapter to convert the extra-pinned Y/C output to a composite RCA socket, and the sometimes-omitted-and-surprisingly-expensive DVI-to-VGA adapter that you'll need if you want to run two non-DVI monitors from the 9700 Pro's twin outputs.

Oh, and there's also Yet Another Copy Of PowerDVD. It's a great program, but you'll get another copy of it with practically every video card and CD/DVD drive you buy. Shingle your house with the spares.

If a 9700 Pro is the card for you, but you don't need one that's heavy enough to stun a burglar, then you can pay less. When I first reviewed this, Aus PC were selling the ATI-branded 9700 Pro (sorry, but it doesn't come with "ATI Factory Racing Team" stickers) for $AU596.20 delivered.

Now, purely for the sake of argument, I'll assume that you're not me, and haven't settled on a 9700 Pro. What else is good?

Well, in the bargain basement category, you've got your GeForce4 MXs.

When I first wrote this, Aus PC Market had 64Mb GeForce4 MX cards for $AU107.80 delivered. For your $AU107.80, you got an Albatron MX440SEN1, powered by the MX440SE chip. The SE is a hair slower than the original MX440, but is AGP-8X-capable. There's no significant performance difference between the SE and the original 440. Steer clear of the MX420, though; that's got much slower memory.

If you've blown your budget on the steaming new motherboard, CPU and RAM that you've installed in your funkalicious aluminium case, but still need a half-decent video card to tide you over until you Simply Must Have A DX9 Capable One, then the MX440SE is just what you need. Upgrade to something much faster when the DX9 game of your choice hits the streets, and you'll not only save money ($[MX440SE now] + $[Radeon/FX Whatever then] < $[Radeon/FX Whatever now]), but end up with a spare video card into the bargain.
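For the sake of illustration, here's that inequality with completely made-up placeholder prices (none of these dollar figures are quotes):

```python
# Buy the cheap card now and the fast card later, versus buying the
# fast card now. All prices hypothetical, in $AU.
mx440se_now = 108        # budget card today
dx9_card_now = 600       # high-end DX9 card today
dx9_card_later = 400     # the same class of card once DX9 games ship

wait_and_see = mx440se_now + dx9_card_later  # 508
jump_in_now = dx9_card_now                   # 600

print(wait_and_see < jump_in_now)  # True - and you keep the MX as a spare.
```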

If you want DX8 performance at the low end then you want a Radeon 9000, 9000 Pro or 9100 (which is, I remind you, really a reborn 8500). They weren't a good idea when ATI's drivers were still... quirky... but now that the drivers have come good, they are.

128Mb of video memory is another thing that was overkill when the Radeon was new but is somewhat useful now; for a 128Mb Radeon 9000, you ought to be looking at around $AU200 delivered.

If you want DX9 with a manageable speed penalty, though (Note: "manageable" is not a well-defined term), then you want the GeForce FX 5200. In the USA, the low-end Radeons (especially the 64Mb models) are likely to be enough cheaper than the FX 5200 that they're clearly better value; here in Australia the prices are likely to be closer, which makes the decision tougher.

Anyway, the 5200 is worth considering if you're not a gaming enthusiast, or just can't afford a faster DX9 card. Keep your resolution and FSAA enthusiasm in check and a 5200 will handle everything that comes out for the next couple of years.

Further up the performance ladder, if you're after maximum frame rate for minimum money, then you want a GeForce4 Ti4200. OK, it may not be one of those glamorous new DirectX 9 confections, but it gets the job done.

In the meantime, there's the Albatron Ti4280PV, which has TV in as well as out, and the conventional one-VGA, one-DVI monitor outputs.

Want a 256Mb card for phat textures, but don't want to spend huge dollars?

Get a 256Mb GeForce FX 5600.

And that, boys and girls, is about it. As of now.

It'll All Be Different in six months, of course. Then I may have to do this all again.

Should I be ashamed to admit that I'll enjoy it?


Buy stuff!
Readers from Australia or New Zealand can purchase all kinds of video cards from Aus PC Market.
(If you're NOT from Australia or New Zealand, Aus PC Market won't deliver to you. If you're in the USA, try a price search at DealTime!)

Want a PC to put it in?

This guide may help.


