NVIDIA Single Card, Multi-GPU: GeForce 7950 GX2
by Derek Wilson on June 5, 2006 12:00 PM EST - Posted in GPUs
F.E.A.R. Performance
These benchmarks show that F.E.A.R. is in no way a CPU-limited game. Even at 1280x1024, the 7950 GX2 holds about a 28% performance lead over the 7900 GTX, and enabling AA pushes that lead to over 46%. The point here is that a time will come when more games demand 7950 GX2 levels of power even at common desktop resolutions. For now, F.E.A.R. is one of the exceptions to the rule, so we still don't recommend that gamers who commonly play at resolutions below 1600x1200 invest in this level of hardware.
The multi-GPU solutions continue to excel under F.E.A.R. as resolution increases. While the 7900 GT and X1900 GT start to become borderline playable at 1600x1200 with 4xAA, the 7900 GT SLI and 7950 GX2 are still buttery smooth.
Even at 2048x1536 with 4xAA, NVIDIA's new high-end flagship sails on at an enjoyable 45 fps. The gap between the 7950 GX2 and the X1900 XT starts to close at this resolution, dropping back down to only about a 30% lead, but this is understandable considering the volume of data that needs to be sent from one GPU to the other. Added stress on memory bandwidth could also be why the 7900 GT SLI setup closes the performance gap with the 7950 GX2, which runs a 120MHz lower effective memory data rate off of each GPU.
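To put that 120MHz figure in perspective, here is a quick back-of-the-envelope sketch. It is not from our test data; it assumes the commonly cited reference clocks of a 1320MHz effective memory data rate for the 7900 GT and 1200MHz for each GPU on the 7950 GX2, both on 256-bit buses:

```python
# Rough per-GPU peak memory bandwidth comparison (sketch; clock and bus
# figures are assumed reference specs, not measurements from this review).

BUS_WIDTH_BITS = 256  # 256-bit memory bus on the 7900 GT and on each GX2 GPU

def peak_bandwidth_gb_s(effective_mhz: float, bus_bits: int = BUS_WIDTH_BITS) -> float:
    """Peak theoretical bandwidth = effective data rate x bus width in bytes."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

gt_bw  = peak_bandwidth_gb_s(1320)  # 7900 GT: 660MHz GDDR3, 1320MHz effective
gx2_bw = peak_bandwidth_gb_s(1200)  # 7950 GX2: 600MHz GDDR3, 1200MHz effective

print(f"7900 GT  per GPU: {gt_bw:.1f} GB/s")   # ~42.2 GB/s
print(f"7950 GX2 per GPU: {gx2_bw:.1f} GB/s")  # ~38.4 GB/s
print(f"GX2 deficit: {(1 - gx2_bw / gt_bw) * 100:.1f}%")  # ~9% less per GPU
```

Under those assumptions, each GX2 GPU gives up roughly 9% of peak memory bandwidth versus a 7900 GT, which is one plausible reason the 7900 GT SLI setup closes in at bandwidth-hungry settings like 2048x1536 with 4xAA.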
60 Comments
kilkennycat - Monday, June 5, 2006 - link
Just to reinforce another poster's comments: Oblivion is now the yardstick for truly sweating a high-performance PC system. A comparison of a single GX2 vs. dual 7900 GTs in SLI would be very interesting indeed, since Oblivion pushes up against the 256MB graphics memory limit of the 7900 GT (with or without SLI), and will exceed it if some of the 'oblivion.ini' parameters are tweaked for more realistic graphics in outdoor environments, especially in combo with some of the user-created texture-enhancement mods.

Crassus - Monday, June 5, 2006 - link
That was actually my first thought and the reason I read the article ... "How will it run Oblivion?" I hope you'll find the time to add some graphs for Oblivion. Thanks.

TiberiusKane - Monday, June 5, 2006 - link
Nice article. Some insanely rich gamers may want to compare the absolute high end, so they may have wanted to see the X1900 XT in CrossFire. It'd help with the comparison of value.

George Powell - Monday, June 5, 2006 - link
Didn't the ATI Rage Fury Maxx postdate the Obsidian X24 card? Also, on another point, it's a pity that there are no Oblivion benchmarks for this card.
Spoelie - Monday, June 5, 2006 - link
Didn't the Voodoo5 postdate that one as well? ^^

Myrandex - Monday, June 5, 2006 - link
For some reason pages 1 and 2 worked for me, but when I tried 3 or higher no page would load and I received a "Cannot find server" error message.

JarredWalton - Monday, June 5, 2006 - link
We had some server issues which are resolved now. The graphs were initially broken on a few charts (all values were 0.0), so the article was taken down until the problem could be corrected.

ncage - Monday, June 5, 2006 - link
This is very cool, but a better idea would be for NVIDIA to use a socket concept where you can swap out the VPU just like you can a CPU. That way you could buy a card with only one VPU and then add another one later if you needed it...

BlvdKing - Monday, June 5, 2006 - link
Isn't that what PCI Express is? Think of a graphics card like a Slot 1 or Slot A CPU back in the old days. A graphics card is a GPU with its own cache on the same PCB. If we were to plug a GPU into the motherboard, it would have to use system memory (slow) or memory soldered onto the motherboard (not upgradable). The socket idea for GPUs doesn't make sense.

DerekWilson - Monday, June 5, 2006 - link
Actually, this isn't exactly what PCIe is... but it is exactly what HTX will be with AMD's Torrenza and coherent HT links from the GPU to the processor. The CPU and the GPU will be able to work much more closely together with this technology.