Quad SLI with 9800 GX2: Pushing a System to its Limit
by Derek Wilson on March 25, 2008 9:00 AM EST - Posted in GPUs
The Setup and The Test
We did have some issues again with this one. If you’ve already got one 9800 GX2 with the driver installed, uninstall the driver, reboot, power down, plug in the second card, boot, reboot, then install the driver. Trust me, it will save you a headache. It seems NVIDIA and AMD still need some time to sort out Vista and multi-GPU when adding a card or removing a card. We didn’t have the same problems we did with CrossFireX, but the potential is there to cause some frustration.
We also ran into a huge (in our opinion) bug that was very difficult to track down. The first two things we do when a graphics driver is installed are to disable vsync and to disable image scaling to fit panel size; we run with no scaling at centered timings. This lets us see things at the same DPI across the board and lets us tell what resolution we are running just by looking at the screen, which saves us a lot of trouble when things inevitably get mucked up for one reason or another. I also tend toward the obsessive/compulsive, and if I can't see it, I need to set the res like four times just to be sure.
In any case, two 9800 GX2 cards in Quad SLI will not run any game at less than panel resolution if scaling is disabled. You run the game and get a black screen. If you change resolution in the game to something lower than panel res, you get a black screen. Well, to be fair, it's not just a black screen; it's a hard lock. This needs to be fixed. It happens on both Skulltrail and 780i, so it's not an isolated issue.
Also, NVIDIA decided to install a link to a trial version of Portal on the user's desktop when their driver is installed. I suppose a link to the site is better than bundling Earthsim, but not even asking if their customer wants more clutter on their desktop before putting it there is terribly inappropriate. I don’t care about bundling a trial, but please ask before you put something on my desktop.
The test system we used is the same as the one from the 9800 GX2 review, as are the driver revisions.
Test Setup
CPU: 2x Intel Core 2 Extreme QX9775 @ 3.20GHz
Motherboard: Intel D5400XS (Skulltrail)
Video Cards: ATI Radeon HD 3870 X2, ATI Radeon HD 3870, NVIDIA GeForce 9600 GT 512MB, NVIDIA GeForce 8800 GT 512MB, NVIDIA GeForce 8800 Ultra, NVIDIA GeForce 9800 GX2
Video Drivers: ATI Catalyst 8.3, NVIDIA ForceWare 174.53
Hard Drive: Seagate 7200.9 120GB 8MB 7200RPM
RAM: 2x Micron 2GB FB-DIMM DDR2-800
Operating System: Windows Vista Ultimate 64-bit SP1
Thanks go out to EVGA for supplying the two 9800 GX2 units for this review.
As for power consumption, here’s what we’ve got from these beasts.
54 Comments
strikeback03 - Wednesday, March 26, 2008 - link
you posted your comment 20 hrs ago. I don't see any comments posted by any Anandtech staff in this review since that time, so no guarantee they have actually read your comment.
gudodayn - Tuesday, March 25, 2008 - link
< Layzer253 ~ No, they shouldn't. It works just fine >
Given that the 9800x2 is a powerful card, more powerful than the 3870x2, the 9800x2 will run into CPU limitations..........
Then in theory, when using the same platform (same CPU, RAM, etc.), shouldn't the 9800x2 score at least within the ballpark of the 3870x2??
It would just mean that with a faster CPU, the 9800x2 will have lots of room for improvement whereas the 3870x2 wouldn't!!
But that's not what's happening here though, is it??
For some reason, the 3870x2 in Crossfire is scaling a lot better than the 9800x2 in SLI ~ in a lot of the tests!!
Either the SLI drivers are messed up or the 9800x2 can't run quad GPUs effectively.........
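The bottleneck argument above can be illustrated with a toy model: a frame only ships once both the CPU work and the GPU work for it are done, so on a CPU-limited platform a faster and a slower card land in roughly the same place. This is a minimal sketch with entirely hypothetical throughput numbers, not measured data from the review.

```python
# Toy bottleneck model: effective FPS is capped by the slower of the
# CPU-side and GPU-side throughput. All numbers are hypothetical.

def effective_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    # A frame is delivered only when both stages have finished their work.
    return min(cpu_fps_limit, gpu_fps_limit)

cpu_limit = 60.0        # frames/sec the CPU can prepare at these settings
gpu_slower_setup = 55.0 # hypothetical GPU throughput of the slower setup
gpu_faster_setup = 90.0 # hypothetical GPU throughput of the faster setup

print(effective_fps(cpu_limit, gpu_slower_setup))  # 55.0 -> nearly CPU-bound
print(effective_fps(cpu_limit, gpu_faster_setup))  # 60.0 -> fully CPU-bound
```

If two setups on the same platform diverge far beyond this kind of cap, the gap is more likely in driver scaling than in a shared CPU limit, which is the point the comment is making.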
LemonJoose - Tuesday, March 25, 2008 - link
I honestly don't know why the hardware vendors keep trying to push these high-end solutions that aren't stable and don't work the way they are supposed to. Exactly who is the market for $500 motherboards that require expensive RAM designed for servers, $1000 CPUs that will be outperformed by mainstream CPUs in a year or less, and $600 video cards that have stability issues and driver problems even when not run in SLI mode?
I would really love it if AnandTech and other hardware sites would come out and give these products the 1 out of 10 or 2 out of 10 review scores that they deserve, and tell the hardware companies to spend more time developing solutions for real enthusiasts with mainstream pocketbooks, instead of wasting their engineering resources on these high-end solutions that nobody wants.
strikeback03 - Wednesday, March 26, 2008 - link
Dunno if anyone has actually bought Skulltrail, but obviously people do buy $1000 CPUs and $500 video cards, as some GX2 owners have already posted in this thread, and some people previously bought the Ultra even when it was well over $500. Not something I would do, but there are obviously those with the time and money to play with this kind of thing.
B3an - Tuesday, March 25, 2008 - link
To the writer of this article, or anyone else... I have this strange problem with the GX2.
In the article you're getting 60+ FPS @ 2560x1200 res with 4xAA. Now if I do this I get nowhere near that frame rate. Without AA it's perfect, but with AA it completely kills it at that res.
At first I was thinking that the 512MB of usable VRAM was not enough for 2560x1600 + AA, so I get a slide show with 4xAA at that res. But from these tests, you're not getting that.
What backed up my theory is that with games with higher-res/bigger textures, even 2xAA would kill frame rates. I'd go from 60+ FPS to literally single-digit FPS just by turning on 2xAA @ 2560x1200. But with older games with lower-res textures I can turn the AA up a lot higher before this happens.
Does anyone know what the problem could be? Because I'd really like to run games at 2560x1600 with AA, but cannot at the moment.
I'm on Vista SP1, with 4GB RAM, and a Quad @ 3.5GHz.
I've also tried 3 different sets of drivers.
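For reference on the VRAM theory above, a rough back-of-the-envelope estimate shows how quickly 4x MSAA render targets add up at 2560x1600. This is a simplified sketch (32-bit color and depth, no compression, no driver overhead, and no texture memory counted), not an exact accounting of what the driver allocates.

```python
# Rough render-target footprint at 2560x1600 with 4x MSAA.
# Simplified assumptions: 32-bit color, 32-bit depth/stencil, no
# compression, textures not counted. Illustrative only.

width, height = 2560, 1600
bytes_per_pixel = 4   # 32-bit RGBA color
samples = 4           # 4x MSAA

color = width * height * bytes_per_pixel * samples   # multisampled color buffer
depth = width * height * 4 * samples                  # 24-bit depth + 8-bit stencil
resolve = width * height * bytes_per_pixel            # resolved back buffer

total_mb = (color + depth + resolve) / (1024 ** 2)
print(f"~{total_mb:.0f} MB for MSAA render targets alone")  # ~141 MB
```

Add double buffering and a game's own high-resolution textures on top of that, and it is easy to see how 512MB per GPU becomes the limit the commenter suspects.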
nubie - Tuesday, March 25, 2008 - link
OK, it is about time that socketable GPUs are on the market. How about a mATX board with an edge-connected PCIe x32 slot? Make the video board ATX compliant (i.e. video board + mATX board = ATX compliant). Then we can finally cool these damn video cards, and maybe find a way of getting them power that doesn't get in the way.
You purchase the proper daughterboard for your class (mainstream, enthusiast, overclocker/benchmarker), and it will come with the RAM slots (or onboard RAM) and proper voltage circuitry. Then you can change just the GPU when you need to upgrade.
I know it would be hard to implement and confusing, but it would be less confusing than the current situation once we got used to it, and it would be a hell of a lot more sane.
You could use a PCI-e extender to connect it to existing PCI-e mATX or full ATX motherboards.
It is either this, or put the GPU socket on the damn motherboard already; it is 2008, and it needs to happen. (If the video card were connected over HyperTransport III you wouldn't have any problem with bandwidth.) The next step really needs to be a general-purpose processor built into the video card, so that you aren't CPU/system bound like the current ridiculous setup. (The 790i chipset seems to be helping with the ability to have the northbridge do batch commands across the GPUs, but at minimum we need to see some physics moving away from the CPU; general physics haven't changed since the universe started, so why waste a general-purpose CPU on them?)
nubie - Tuesday, March 25, 2008 - link
Just to reiterate, why am I buying a new card every time when they just seem to be recycling the reference design? All GPU systems need the same things as a motherboard: clean voltage to the main processor and its memory, as well as a bus for the RAM and some I/O. The 7900, 8800, and 9600 are practically the same boards with the same memory bus, so can't we have it socketed?
tkrushing - Tuesday, March 25, 2008 - link
I am all for SLI/Crossfire or whatever you can afford to do, but why are we starting to lose focus on single-card users? I'm sure if I really wanted to sacrifice I could save up for a multiple-GPU system, but a single G92 costs less than half the price for a relatively small performance hit in Crysis. And yes, I love it, but I'm saying it: I just think Crysis is a poorly optimized game to some degree. Give us new and not reused single-card solutions! (9 series)
tviceman - Tuesday, March 25, 2008 - link
I have been thinking the same thing lately, but last week, after reading about Intel's plans to use one of its cores in CPUs as a graphics unit, I started thinking about the ramifications of this. I am willing to bet that NVIDIA is trying to develop a hybrid CPU/GPU to compete on the same platform that Intel and AMD will eventually have. If this is true, then it's probably a reasonable explanation as to why there has been a severe lack of all-new GPUs since the launch of the 8xxx series a year ago.
dlasher - Tuesday, March 25, 2008 - link
page 1, paragraph 2, line 1:
Is:
"...it’s not for the feint of heart.."
Should be:
"...it’s not for the faint of heart.."