MultiGPU Update: Two-GPU Options in Depth
by Derek Wilson on February 23, 2009 7:30 AM EST - Posted in GPUs
Graphics is infinitely parallel. There is always more work to be done, and that work can be broken up into millions of completely independent operations. This is very different from most tasks we see done on the CPU, which don't scale nearly as easily with the number of cores. While we might see small improvements by adding another CPU, we can see nearly double the performance by doubling the number of graphics processors (as long as there are no other bottlenecks, anyway). This is why AMD and NVIDIA have invested so much money in their respective multiGPU solutions, CrossFire and SLI.
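As a rough illustration of that near-linear scaling claim, here is a minimal sketch in Python; the frame rates are made up for the example and are not benchmark results from this article.

```python
# Minimal sketch: scaling efficiency of a multiGPU setup relative to ideal
# linear scaling. The frame rates below are hypothetical, not measured data.

def scaling_efficiency(fps_single: float, fps_multi: float, num_gpus: int) -> float:
    """Return the fraction of ideal (linear) scaling that was achieved."""
    ideal_fps = fps_single * num_gpus
    return fps_multi / ideal_fps

# Example: one GPU at 42 fps, two GPUs at 78 fps -> about 93% of a perfect doubling.
print(f"{scaling_efficiency(42.0, 78.0, 2):.0%}")
```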
MultiGPU solutions have been around for a few years now, and while we frequently include single-card multiGPU solutions in our reviews, we only occasionally take an in-depth look at multiGPU technology. Some time has passed since we last studied the issue, and now that we've fully broken in our Core i7 system, 64-bit Vista, and recent graphics drivers, it's time to get to it.
Over the past few weeks we've been benchmarking and analyzing lots of numbers. We've looked at single-, two-, three-, and four-GPU systems across multiple games and resolutions. The configurations we chose are current-generation, high-ish-end hardware capable of operating in 3-way and 4-way configurations. Because of the sheer volume of data we collected, we've decided to break up our analysis into multiple articles.
This first article (the one you're reading right now) will cover single and dual GPU configurations (including single card multiGPU hardware). The next article will add 3-way solutions along with comparisons back to single and dual GPU setups. The final article will add in 4-way performance analysis and compare it back to the single, dual and 3-way data. Splitting up the analysis this way will allow us to dive deep into each type of configuration individually without spreading the content too thin. We can keep focus on a specific aspect of multiGPU performance and scaling while still making all the relevant comparisons.
The initial installment also introduces the Sapphire Radeon HD 4850 X2 2GB. Though we expected AMD to push the 4850 X2 out in the same way it launched the 4870 X2, we've only seen one version of the 4850 X2 hit the scene, late last year from Sapphire. In light of what we've seen, we are rather surprised that there hasn't been more fanfare behind this part from either AMD or other board makers. The lighter-weight X2 competes more directly in price and performance with the GeForce GTX 280/285, and really fills out the lineup for AMD. Overall, the increased RAM in the 4850 X2 2GB enables great performance scaling even at resolutions the 512MB 4850 can't come close to handling.
As for the topics we'll cover, our interest will focus on the scalability of these multiGPU solutions and their relative value. Before jumping into the numbers, we'll cover the metrics we use to analyze our data. First, we'll look at scaling and talk about the big picture. Then we'll talk about how we calculate a value comparison.
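To make the value idea concrete, here is a minimal sketch, assuming average frame rate divided by street price as the figure of merit; the card names, frame rates, and prices below are placeholders, not figures from our testing.

```python
# Hypothetical value comparison: average frame rate per dollar of street price.
# All names and numbers are placeholders used only to illustrate the metric.

cards = {
    "Single-GPU card": {"avg_fps": 55.0, "price_usd": 300.0},
    "Dual-GPU card":   {"avg_fps": 98.0, "price_usd": 550.0},
}

for name, data in cards.items():
    fps_per_dollar = data["avg_fps"] / data["price_usd"]
    print(f"{name}: {fps_per_dollar:.3f} fps per dollar")
```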
95 Comments
Hauk - Monday, February 23, 2009 - link
To you grammer police... get a life will ya?!? Who gives a rats ass! It's the data!
Your smug comments are of ZERO value here. You want to critique, go to a scholarly forum and do so.
Your whining is more of a distraction! How's that for gramaticly correct?
Slappi - Tuesday, February 24, 2009 - link
It should be grammar not grammer.
SiliconDoc - Wednesday, March 18, 2009 - link
Grammatically was also spelled incorrectly. lol
The0ne - Monday, February 23, 2009 - link
"In general, more than one GPU isn't that necessary for 1920x1200 with the highest quality settings,..."I see many computer setups with 22" LCDs and lower that have high end graphic cards. It just doesn't make sense to have a high end card when you're not utilizing the whole potential. Might as well save some money up front and if you do need more power, for higher resolutions later, you can always purchase an upgrade at a lower cost. Heck, most of the time there will be new models out :)
Then again, I have a qaud-core CPU that I don't utilize too but... :D
7Enigma - Monday, February 23, 2009 - link
Everyone's situation is unique. In my case I just built a nice C2D system (OC'd to 3.8GHz with a lot of breathing room up top). I have a 4870 512MB that is definitely overkill with my massive 19" LCD (1280x1024). But within the year I plan on giving my dad or wife my 19" and going to a 22-24". Using your logic I should have purchased a 4850 (or even 4830) since I don't NEED the power. But I did plan ahead to future-proof my system for when I can benefit from the 4870.
I think many people also don't upgrade their systems nearly as frequently as some of the enthusiasts do. So we spend a bit more than we would need to at that particular time to future-proof a year or two ahead.
Different strokes and all that...
strikeback03 - Monday, February 23, 2009 - link
The other side of the coin is that most likely, for similar money, you could have bought something now that more closely matches your needs, and a 4870 in a year, once it has been replaced by a new card, if it still meets your needs.
7Enigma - Tuesday, February 24, 2009 - link
Of course. Or I could spend $60 now, another $60 in 3 months, and you see the point. It's all dependent on your actual need, your perceived need, and your desire to not have to upgrade frequently.
I think the 4870 is one of those cards, like the ATI 9800 Pro, that has a perfect combination of price and performance to be a very good performer for the long haul (similar to how the 8800GTS was probably the best part from a price/performance/longevity standpoint if you were to buy it the day it first came out).
Also important is looking at both companies and seeing what they are releasing in the next 3-6 months for your/my particular price range. Everything coming out seems to be focused either on the super high end, or the low end. I don't see any significant mid-range pieces coming out in the next 3-6 months that would have made me regret my purchase. If it was late summer or fall and I knew the next round of cards were coming out I *may* have opted for a 9600GT or other lower-midrange card to hold over until the next big thing but as it stands I'll get easily a year out of my card before I even feel the need to upgrade.
Frankly the difference between 70fps and 100fps at the resolutions I would be playing (my upgrade would be either to a 22 or 24") is pretty moot.
armandbr - Monday, February 23, 2009 - link
http://www.xbitlabs.com/articles/video/display/rad...
here you go
Denithor - Monday, February 23, 2009 - link
Second paragraph, closing comments:
Fourth paragraph, closing comments:
Please remove the apostrophe from the first sentence (where it should read its) and instead move it to the second (which should be we're).
Otherwise excellent article. This is the kind of work I remember from years past that originally brought me to the site.
One thing - would it be too difficult to create a performance/watt chart based on a composite performance score for each single/pair of cards?
I do think you really pushed the 4850X2 a bit too much. The 9800GTX+ provides about the same level of performance (better in some cases, worse in others) and the SLI version manages to kick the crap out of the GTX 280/285 nearly across the board (with the exception of a couple of 2560x1600 memory-constricted cases) at a lower price point. That's actually in my mind one of the best performance values available today.
SiliconDoc - Wednesday, March 18, 2009 - link
Forget about Derek removing the apostrophe, how about removing the raging red fanboy ATI drooling? When the GTX 260 SLI scores in 20 of the 21 game runs and the 4850 DOESN'T, Derek is sure not to mention the GTX 260, yet on the very same page blabs that the Sapphire 4850 "ran every test"...
This is just another red raging fanboy blab - so screw the apostrophe!
NVIDIA DISSED 'em because they can see the articles Derek posts here bleeding red all over the place.
DUH.