ATI's New Leader in Graphics Performance: The Radeon X1900 Series
by Derek Wilson & Josh Venning on January 24, 2006 12:00 PM EST - Posted in GPUs
Image Quality, Feature Tests, and Power
Something we'd like to look at a bit more in-depth for this review is image quality. It's no secret that because ATI and NVIDIA render graphics differently, there will always be some variation in the look of the image from one brand to the other. Most of the time this variation is too subtle to notice, but upon closer inspection, certain patterns tend to emerge.
With Black and White 2, we can see how well the game's own maximum AA setting cleans up the image. Note the significant difference between the edges in the shots without AA and with the game's "high" AA enabled. However, we don't see the same kind of difference between the image without AA and the one with maximum quality forced in the graphics driver. This is a good example of in-game AA doing a much better job, in both quality and performance, than the max quality settings in the control panel. We suspect that Black and White 2 implements a custom AA algorithm and has issues running stock MSAA algorithms. For this reason, we recommend using Black and White 2's in-game AA rather than the control panel's AA settings.
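For readers who want a feel for what multisample AA is actually doing on those edges, here is a minimal sketch. To be clear, this is not ATI's, NVIDIA's, or the game's real code; the edge equation and the sample offsets are invented for illustration. The idea is simply that testing several sub-pixel points per pixel turns a hard on/off edge into fractional coverage:

```python
# A minimal sketch of why multisampling smooths edges. This is NOT ATI's,
# NVIDIA's, or Black and White 2's actual algorithm; the edge equation and
# sample offsets below are invented for illustration.

def inside_edge(x, y):
    """Half-plane test standing in for a triangle edge: below y = 0.6x?"""
    return y < 0.6 * x

def pixel_coverage(px, py, samples):
    """Fraction of sub-pixel sample points covered: 0.0 (empty) to 1.0 (full)."""
    hits = sum(inside_edge(px + dx, py + dy) for dx, dy in samples)
    return hits / len(samples)

CENTER = [(0.5, 0.5)]                            # no AA: one sample per pixel
ROTATED_GRID = [(0.375, 0.125), (0.875, 0.375),  # 4 samples in a rotated-grid
                (0.125, 0.625), (0.625, 0.875)]  # pattern, as 4x MSAA might use

for x in range(4):                               # a row of pixels crossing the edge
    no_aa = pixel_coverage(x, 1.0, CENTER)
    msaa4 = pixel_coverage(x, 1.0, ROTATED_GRID)
    print(f"pixel {x}: no AA = {no_aa:.2f}, 4 samples = {msaa4:.2f}")
```

At the pixel straddling the edge, the single-sample version snaps straight from 0.00 to 1.00 while the four-sample version lands at 0.50; that intermediate value is exactly the jagged-versus-smooth difference visible in the screenshots. A game's own AA path can make these decisions with knowledge of the scene that a driver-forced mode doesn't have.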
Both ATI and NVIDIA hardware look great and render similar images, and luckily for ATI there is an upcoming patch that should improve performance.
Battlefield 2 gives us a good look at how the maximum quality settings in the control panel (specifically transparency AA) fix certain graphical problems in games. Fences in particular tend to render inaccurately, especially when viewed through at certain angles. While the in-game AA without adaptive or transparency AA cleans up a lot of jagged edges (the flag pole, for instance), it still has trouble with parts of the fence.
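Why do fences resist normal AA in the first place? A fence is typically one textured quad with an alpha test, so there are no geometric edges inside it for MSAA to find. The toy sketch below (assumed names and a made-up one-dimensional "fence texture", nothing from either vendor's driver) contrasts a single per-pixel alpha test with re-running that test at each sub-pixel sample, which is roughly what transparency/adaptive AA does:

```python
# Toy illustration (assumed names; not either vendor's driver code) of why
# alpha-tested fences defeat ordinary MSAA. The fence is one big quad, so the
# "edges" of its wires come from an alpha test on the texture -- and plain
# MSAA evaluates that test only once per pixel.

def fence_alpha(u):
    """Made-up 1D fence texture: thin opaque wires, transparent gaps."""
    return 1.0 if (u * 0.7) % 0.5 < 0.15 else 0.0

def plain_msaa(px):
    # One alpha test at the pixel center decides the whole pixel: on or off.
    return 1.0 if fence_alpha(px + 0.5) > 0.5 else 0.0

def transparency_aa(px, offsets):
    # Transparency/adaptive AA re-runs the alpha test per sub-pixel sample,
    # so a pixel can end up fractionally covered.
    hits = sum(fence_alpha(px + dx) > 0.5 for dx in offsets)
    return hits / len(offsets)

OFFSETS = [0.125, 0.375, 0.625, 0.875]  # 4 sub-pixel positions along one axis

for x in range(4):
    print(f"pixel {x}: plain MSAA = {plain_msaa(x):.2f}, "
          f"transparency AA = {transparency_aa(x, OFFSETS):.2f}")
```

With one alpha test per pixel, each pixel snaps to fully opaque or fully transparent, which is the stair-stepping visible in the fence shots; repeating the test per sub-pixel position yields fractional coverage and smoother wires.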
As for power, we ran the multitexturing and pixel shader feature tests in 3DMark06 and measured the maximum power load with our trusty Kill-A-Watt. This measures the entire system's power draw at the wall, before the PSU, so it reflects more than just the graphics cards.
We can see that the CrossFire and SLI systems pull insane amounts of power, but even as a single card, the X1900 XTX is a very hungry part.
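As a rough illustration of what a wall reading can and can't tell you, here is a back-of-the-envelope sketch. The wattage figures and the 80% PSU efficiency below are made-up placeholders, not measurements from this review:

```python
# Back-of-the-envelope sketch of what a Kill-A-Watt reading does and doesn't
# tell you. It measures the WHOLE system's AC draw before the PSU, so isolating
# one component means differencing two readings and backing out PSU losses.
# All numbers here are made-up placeholders, not measurements from this review.

PSU_EFFICIENCY = 0.80  # assumed; real efficiency varies with load

def dc_delta(wall_idle_w, wall_load_w, efficiency=PSU_EFFICIENCY):
    """Estimate the extra DC power drawn between two wall (AC) readings."""
    return (wall_load_w - wall_idle_w) * efficiency

idle, load = 180.0, 340.0  # hypothetical: desktop idle vs. 3DMark06 feature tests
print(f"extra AC draw at the wall: {load - idle:.0f} W")
print(f"rough extra DC draw:       {dc_delta(idle, load):.0f} W")
# Even that difference isn't the video card alone: the CPU and memory also
# work harder under load, so wall numbers overstate GPU-only power.
```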
120 Comments
poohbear - Tuesday, January 24, 2006 - link
$500 too much? there are cars for $300,000+, but u dont see the majority of ppl complaining because they're NOT aimed at u and me, and ferrari & lamborghini could care less what we think cause we're not their target audience. get over yourself, there ARE cards for you in the $100-$300 range, so what are u worried about?

timmiser - Tuesday, January 24, 2006 - link
While I agree with what you are saying, we are already on our 3rd generation of $500 high end graphic cards. If memory serves, it was the Nvidia 6800 that broke the $500 barrier for a single card solution.

I'm just happy it seems to have leveled off at $500.
Zebo - Tuesday, January 24, 2006 - link
Actually, GPUs in general scale very well with price/performance and this is no exception. Twice as fast as an X850 XT, which you can get for $275, should cost twice as much, or $550, which it does. If you want to complain about prices, look at CPUs, high end memory, and raptors/SCSI, where higher line items offer small benefits for huge price premiums.

fishbits - Tuesday, January 24, 2006 - link
Geez, talk about missing the point. News flash: Bleeding edge computer gear costs a lot. $500 is an excellent price for the best card out. Would I rather have it for $12? Yes. Can I afford/justify a $500 gfx card? No, but more power to those who can, and give revenue to ATI/Nvidia so that they can continue to make better cards that relatively quickly fall within my reach. I can't afford a $400 9800 pro either... whoops! They don't cost that much now, do they?

Short-sighted again. Look at the launch of Unreal games for instance. Their code is always awesome on the performance side, but can take advantage of more power than most have available at release time. You can tell them their code is shoddy, good luck with that. In reality it's great code that works now, and your gaming enjoyment is extended as you upgrade over time and can access better graphics without having to buy a new game. Open up your mind, quit hating and realize that these companies are giving us value. You can't afford it now, neither can I, but quit your crying and applaud Nv/ATI for giving us constantly more powerful cards.
aschwabe - Tuesday, January 24, 2006 - link
Agreed, I'm not sure how anyone considers $500 for ONE component a good price. I'll pay no more than $300-350 for a vid card.
Hear, hear!! A voice of reason!rqle - Tuesday, January 24, 2006 - link
I like the new line graph colors and interface, but I like bar graphs so much more. Never been a big fan of SLI or Crossfire on the graphs; it makes them distracting, especially since they only represent a small group. Wonder if crossfire and sli could have their own graph, or maybe their own color. =)
It would be possible for us to look at multi-GPU solutions separately, but it is quite relevant to compare single card performance to multi-GPU performance -- especially when trying to analyze scaling.
Good reading! Good to see ATI getting back in the game. Now let's see some price competition for a change.

I don't understand what CrossFire XTX means. I thought there was no XTX crossfire card? Since the Crossfire and XT have the same clocks, it shouldn't matter if the other card is an XTX. By looking at the graphs it would seem I was wrong, but how can this be? This would indicate that the XTX has more going for it than just the clocks, but that is not so, right?
Bah, I'm confused :)
DigitalFreak - Tuesday, January 24, 2006 - link
My understanding is that Crossfire is async, so both cards run at their maximum speed. The XTX card runs at 650/1.55, while the Crossfire Edition card runs at 625/1.45. You're right, there is no Crossfire Edition XTX card.