Fall '06 NVIDIA GPU Refresh - Part I: GeForce 7900 GS
by Derek Wilson on September 6, 2006 9:00 AM EST
The Test
Coming up with an ideal testbed for graphics card comparisons has become a bit tricky of late. In the past, we used AMD Athlon 64/X2 configurations, as these were the highest performing platforms, with the added benefit of being able to run SLI and/or CrossFire on the best chipset options for the respective GPUs. Intel's Core 2 Duo launch has muddied the waters somewhat: we are now stuck testing CrossFire on a non-ATI chipset, and SLI testing with Core 2 Duo requires a somewhat outdated nForce4 SLI X16-based motherboard. The nForce 590 SLI for Intel will become available in the near future, and although the primary difference will be in features, performance may also be better.
In the end, decisions have to be made on how to test our GPUs, and compromises may be necessary. For now, we have restricted testing to single PCI-E X16 solutions. When we provide the second part of this GPU launch covering the 7950 GT, we will also take a look at CrossFire and SLI performance from the various offerings. Here's the test configuration we used.
CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboard: Intel D975XBX (LGA-775)
Chipset: Intel 975X
Chipset Drivers: Intel 7.2.2.1007
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 6.8 / NVIDIA ForceWare 91.47
Desktop Resolution: 1920 x 1440 - 32-bit @ 60Hz
OS: Windows XP Professional SP2
The games we have chosen to test represent a wide variety of engines and styles. Due to the time constraints of this article, we are testing 7 games today. As interest in HDR and advanced visual effects continues to rise, the tradeoff required for antialiasing is often overshadowed by the quality available from other options. This is especially true in games like Splinter Cell: Chaos Theory, Oblivion, and Black & White 2. In every game but Splinter Cell: Chaos Theory and Oblivion, we will be testing with and without 4x antialiasing; those two titles really shine when HDR is enabled, so we won't disable it just to apply antialiasing.
In reporting our results, in the hopes of increasing readability, we will include a snapshot of one resolution using our standard graphing engine alongside a resolution scaling line graph.
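As a rough illustration of the resolution scaling line graphs described above, here is a minimal plotting sketch, assuming Python with matplotlib; the card names and frame rates below are hypothetical placeholders, not results from this article.

```python
# Minimal sketch of a resolution scaling line graph.
# Card names and FPS values are hypothetical placeholders, not article data.
import matplotlib.pyplot as plt

resolutions = ["800x600", "1024x768", "1280x1024", "1600x1200"]
fps = {
    "Card A": [112.0, 98.5, 79.3, 61.2],  # placeholder numbers
    "Card B": [104.7, 93.1, 76.8, 58.4],  # placeholder numbers
}

# One line per card, average FPS plotted against test resolution.
for card, values in fps.items():
    plt.plot(resolutions, values, marker="o", label=card)

plt.xlabel("Resolution")
plt.ylabel("Average frames per second")
plt.title("Resolution scaling (illustrative data)")
plt.legend()
plt.grid(True)
plt.show()
```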
Comments
phusg - Tuesday, September 12, 2006 - link
Hi Derek, I'm a little late to the ball, but still
> cheaper price tag
really grates me! I know it's pretty endemic but it's still logically incorrect. A price tag can be lower or higher, but not cheaper, unless it's the price tag itself being sold. It's the product that can be cheaper.
Cheers Derek and don't let me catch you making this one again or there'll be hell to pay ;-)
Pete
imaheadcase - Thursday, September 7, 2006 - link
Could you post a link to the BF2 demo you use, so we can compare our systems' video cards to the new ones?
Stele - Wednesday, September 6, 2006 - link
At first glance, it seems that ATI has markedly improved their OpenGL implementation, at least for the Doom 3 engine. However, after a moment's thought, considering the vast difference in performance from before and also the qualifiers noted in the article,
one can't help but wonder - just wonder - if there's anything here that smells like the last quake.exe driver optimisation trick... which, curiously enough, was also pulled by ATI (IIRC it was during the Radeon 8500's time?). I wonder!
Ryan Smith - Wednesday, September 6, 2006 - link
There's no quackery as far as we know. The problem with City of Heroes is a shader corruption bug, plus a bug related to rendering on a secondary buffer, according to Cryptic (the developers of CoH). Whatever ATI did to speed up OpenGL performance here, they apparently didn't take CoH into account.
Stele - Thursday, September 7, 2006 - link
Excellent! I'm deciding between the X1900 GT and 7900 GS (when the latter shows up in the channels), and this improvement would help strengthen the case for the X1900 a bit. :)
S3anister - Wednesday, September 6, 2006 - link
Found an XFX version of this card on Newegg for $189 after MIR: http://www.newegg.com/Product/Product.asp?Item=N82...
emilyek - Wednesday, September 6, 2006 - link
A worthless SKU. The X1900 GT and X1800 XT/GTO2 are better and almost $50 cheaper.
sharkdude - Wednesday, September 6, 2006 - link
The Oblivion percentages are the same in this graph as in the graph on page 4 for all resolutions, when in fact only the 800x600 numbers should be the same. On page 5 the numbers should be 4.1%, 10.1%, 6.4%, and 7.3% for 800x600, 1024x768, 1280x1024, and 1600x1200. Note that the text below the chart should also change 15% to 10%.
DerekWilson - Wednesday, September 6, 2006 - link
Corrected -- but your number for 16x12 appears to be wrong as well. :-)
Lifted - Wednesday, September 6, 2006 - link
Thanks for including the 6600 and 6800 cards in the benchmarks.