ATI's New Leader in Graphics Performance: The Radeon X1900 Series
by Derek Wilson & Josh Venning on January 24, 2006 12:00 PM EST - Posted in GPUs
Battlefield 2 Performance
Battlefield 2 has been a standard in our performance benchmarks for some time, and it remains one of our most important tests. Thanks to its impressive engine, the game is still among the titles that make the best use of the current generation of graphics hardware.
The first thing to note is a theme that runs throughout all of the performance tests in this review: the X1900 XTX and X1900 XT perform very similarly to each other, in some cases differing by only a couple of frames per second. This is significant considering that the X1900 XTX costs about $100 more than the X1900 XT.
Below we have two sets of graphs for three different settings: no AA, 4xAA/8xAF, and maximum quality (higher AA and AF settings in the driver). Note that our BF2 benchmark had problems with NVIDIA's SLI, so we were forced to omit those numbers. With and without AA, the ATI and NVIDIA cards each perform very similarly within their respective camps. Since ATI tends to do a little better with AA enabled than NVIDIA, ATI holds a slight edge here. With the maximum quality settings we see a large drop in performance, which is expected. Keep in mind that NVIDIA's drivers allow AA up to 8X while ATI's allow only up to 6X, so these numbers aren't directly comparable.
120 Comments
poohbear - Tuesday, January 24, 2006 - link
$500 too much? There are cars for $300,000+, but u dont see the majority of ppl complaining because they're NOT aimed at u and me, and Ferrari & Lamborghini could care less what we think cause we're not their target audience. Get over yourself, there ARE cards for you in the $100-$300 range, so what are u worried about?
timmiser - Tuesday, January 24, 2006 - link
While I agree with what you are saying, we are already on our 3rd generation of $500 high-end graphics cards. If memory serves, it was the NVIDIA 6800 that broke the $500 barrier for a single-card solution. I'm just happy it seems to have leveled off at $500.
Zebo - Tuesday, January 24, 2006 - link
Actually, GPUs in general scale very well on price/performance, and this is no exception. Twice as fast as an X850 XT, which you can get for $275, so it should cost twice as much, or $550, which it does. If you want to complain about prices, look at CPUs, high-end memory, and Raptors/SCSI, where the higher-end items offer small benefits for huge price premiums.
fishbits - Tuesday, January 24, 2006 - link
Geez, talk about missing the point. News flash: bleeding-edge computer gear costs a lot. $500 is an excellent price for the best card out. Would I rather have it for $12? Yes. Can I afford/justify a $500 gfx card? No, but more power to those who can, and give revenue to ATI/NVIDIA so that they can continue to make better cards that relatively quickly fall within my reach. I couldn't afford a $400 9800 Pro either... whoops! They don't cost that much now, do they?
Short-sighted again. Look at the launch of Unreal games, for instance. Their code is always awesome on the performance side, but can take advantage of more power than most have available at release time. You can tell them their code is shoddy; good luck with that. In reality it's great code that works now, and your gaming enjoyment is extended as you upgrade over time and can access better graphics without having to buy a new game. Open up your mind, quit hating, and realize that these companies are giving us value. You can't afford it now, neither can I, but quit your crying and applaud NV/ATI for giving us constantly more powerful cards.
aschwabe - Tuesday, January 24, 2006 - link
Agreed. I'm not sure how anyone considers $500 for ONE component a good price. I'll pay no more than $300-350 for a vid card.
bamacre - Tuesday, January 24, 2006 - link
Hear, hear!! A voice of reason!
rqle - Tuesday, January 24, 2006 - link
I like the new line graph colors and interface, but I like bar graphs so much more. Never been a big fan of SLI or Crossfire on the graphs; they make it distracting, especially since they only represent a small group. Wonder if Crossfire and SLI can have their own graph by themselves, or maybe their own color. =)
DerekWilson - Tuesday, January 24, 2006 - link
It could be possible for us to look at multi-GPU solutions separately, but it is quite relevant to compare single-card performance to multi-GPU performance -- especially when trying to analyze performance.
Live - Tuesday, January 24, 2006 - link
Good reading! Good to see ATI getting back in the game. Now let's see some price competition for a change.
I don't understand what CrossFire XTX means. I thought there was no XTX CrossFire card? Since the CrossFire Edition and the XT have the same clocks, it shouldn't matter if the other card is an XTX. By looking at the graphs it would seem I was wrong, but how can this be? That would indicate that the XTX has more going for it than just the clocks, but that is not so, right?
Bha I'm confused :)
DigitalFreak - Tuesday, January 24, 2006 - link
My understanding is that CrossFire runs asynchronously, so both cards run at their maximum speed. The XTX card runs at 650/1.55, while the CrossFire Edition card runs at 625/1.45. You're right, there is no CrossFire Edition XTX card.