More Mainstream DX10: AMD's 2400 and 2600 Series
by Derek Wilson on June 28, 2007 8:35 AM EST - Posted in GPUs
The Test and Power
We will only be looking at DX9 performance under Windows XP today. This is still the platform of choice for gamers, and thus very important to examine. That doesn't mean we are ignoring DX10: we have a follow-up article on DX10 performance coming down the pipe next week, in which we'll look at how these cards stack up in the currently available DX10 games and demos.
We are also planning to look at UVD vs. PureVideo in a follow-up article. Video decode is an important feature of these cards, and we are interested in seeing how the AMD and NVIDIA hardware stack up against each other. Please stay tuned for that article as well.
For this series of tests, we used the following setup:
Performance Test Configuration:
CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboard: ASUS P5W-DH
Chipset: Intel 975X
Chipset Drivers: Intel 8.2.0.1014
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 8.38.9.1-rc2; NVIDIA ForceWare 158.22
Desktop Resolution: 1280 x 800 - 32-bit @ 60Hz
OS: Windows XP Professional SP2
As for power, the 65nm AMD hardware turns in rather unimpressive results. At idle, the 8600 GTS and 8600 GT draw less power than the 2600 XT and 2600 Pro, respectively. Under load the AMD parts become more competitive, but even the 65nm process can't bring the 2600 XT's power draw below that of the 8600 GTS.
As for our game tests, we'll first look at how the new AMD HD series parts stack up against their NVIDIA 8 series competitors. After that we'll break the tests down by game and show performance versus previous and current generation hardware.
96 Comments
valnar - Friday, June 29, 2007 - link
Maybe I'm the opposite of most people here, but I'm glad ATI/AMD and Nvidia both produced mid-range cards that suck. Maybe we will finally get the game developers to slow down and produce tighter code, or not waste GPU/CPU cycles on eye candy and actually produce better gameplay. While I understand that most game companies write games that play acceptably on the $400 flagship video cards, I for one am not one of those people. It's not that I can't afford to buy a $400 card once in a while - it's having to spend that every year that ticks me off. I'd much rather upgrade my card every year to keep up with the times if said card was $120.

titan7 - Saturday, June 30, 2007 - link
Most game developers already do that. If you don't have the power to run the shaders and enable D3D10 features, you can run in D3D9 mode. If your card still doesn't have the power for that, you can run in pixel shader 1 mode.

Take a game like Half-Life 2, for example. Turn everything up and it was too much for most high-end cards when it shipped. But you can turn it down so it looks like a typical D3D8 or D3D7 era game and play it just fine on your old hardware.
If you're hoping that developers can somehow make things run just as well on a Pentium III as on a Core 2 Duo, you're hoping for the impossible. The 2600 only has about 1/3 the processing power of a 2900. The 2400 has about 10% of the power! Think about underclocking your CPU to 10% of its speed and seeing how your applications run ;)
Thank goodness we can disable features.
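For what it's worth, those ratios roughly match a back-of-the-envelope estimate from the published specs. Below is a minimal sketch in Python using AMD's published stream processor counts and core clocks for these parts; note that raw ALU throughput is only a crude proxy for real game performance, since it ignores memory bandwidth, ROPs, and driver efficiency entirely:

    # Theoretical shader throughput = stream processors x core clock (MHz).
    # Published specs: HD 2900 XT (320 SPs @ 742MHz), HD 2600 XT (120 @ 800),
    # HD 2400 XT (40 @ 700).
    gpus = {
        "HD 2900 XT": (320, 742),
        "HD 2600 XT": (120, 800),
        "HD 2400 XT": (40, 700),
    }
    base = gpus["HD 2900 XT"][0] * gpus["HD 2900 XT"][1]
    for name, (sps, mhz) in gpus.items():
        print(f"{name}: {sps * mhz / base:.0%} of HD 2900 XT throughput")

This works out to roughly 40% for the 2600 XT and 12% for the 2400 XT, in the same ballpark as the 1/3 and 10% figures in the comment above.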
DerekWilson - Friday, June 29, 2007 - link
PLEASE READ THE UPDATE AT THE BOTTOM OF PAGE 1

I would like to apologize for not catching this before publication, but the 8600 GTS we used had a slight overclock, resulting in about 5% higher performance across the board than we should have seen.
We have re-run the tests on our stock clocked 8600 GTS and edited all the graphs in our article to reflect the changes. The overall results were not significantly affected, but we are very interested in being as fair and accurate as possible.
We have also added idle and load power numbers to The Test page.
Again, I am very sorry for the error, and I will do my best to make sure this doesn't happen again.
coldpower27 - Friday, June 29, 2007 - link
Meh, thanks Derek, but if you already have factory overclocked results, it would be nice to leave them in; they're fair game if Nvidia's partners are selling cards in those configurations. This is of course in addition to the Nvidia reference clock rates.

DerekWilson - Friday, June 29, 2007 - link
The issue is that overclocked 8600 GTS parts generally go for closer to $200, putting them well out of the price range the 2600 XT is expected to hit. It's not a fair comparison to make at this point (we expect the 2600 XT to come in at <= $150 anyway).
coldpower27 - Saturday, June 30, 2007 - link
Yeah, but you have all kinds of GPUs on the chart anyway, from many different price points; the 7950 GT is not close to $150 either, and neither is the 8800 GTS 320.

I think people would be quite aware that the factory OC cards, if included, are indeed priced higher, but if you already have the results, leave them in, in addition to the Nvidia reference clock designs.
dm0r - Friday, June 29, 2007 - link
And please, keep us informed about performance with new drivers, because I'm really interested in midrange video cards :)

harpoon84 - Friday, June 29, 2007 - link
For God's sake, isn't it obvious by now that running DX10 games on these cards will result in LOWER performance, not HIGHER? If you are averaging 30fps @ 1280x1024 in DX9 games, it's only gonna get worse in DX10!

http://www.extremetech.com/article2/0,1697,2151677...
Company of Heroes DX10 - SINGLE DIGIT FRAMERATES!!!
Yes, the 2600 cards are twice as fast as the 8600 cards, but we are talking totally unplayable framerates of 5-9 FPS!
Yeah, designed for DX10 alright! /SARCASM
Where's TA152H now, huh?
frostyrox - Friday, June 29, 2007 - link
To TA152H: Hi. It should've been painfully obvious about 10 comments ago that nobody here agrees with or, well, understands anything you're saying. Can you please stop commenting on hardware articles when you don't know what you're talking about? To say that DX9 benchmarks aren't important or, heck, not the most important aspect of these cards makes 0 sense. These cards might be DX10 capable, but they obviously haven't been given the hardware or raw horsepower to even handle DX9 (even at common resolutions). It also makes 0 sense whatsoever to suggest these cards will somehow magically perform drastically differently in pure DX10 games.
Furthermore...
How does it make any sense for AMD/ATI to have a card that's over 400 dollars (2900 XT) that trades blows with the 320MB and 640MB 8800 GTS (which are cheaper), but then have nothing between that card and a heavily hardware-castrated 2600 XT at 150 dollars (a ~250 dollar gap)? Also consider the fact that Nvidia's DX10 mid- and low-end cards have matured, so even if you were frugal and wanted the cheapest possible clownshoes video card around, you should just go Nvidia. Don't even bother calling me a fanboy unless you feel like making me laugh. I currently own a Radeon X800 XL. I'm just being honest. It's about time the rest of you do the same. Rant over.
GlassHouse69 - Thursday, June 28, 2007 - link
Crossfire always gets better results than SLI... two 2600 XT cards in Crossfire would draw MUCH less wattage than any other option around for their framerate. At least some Asian websites are noticing.