ATI's New High End and Mid Range: Radeon X1950 XTX & X1900 XT 256MB
by Derek Wilson on August 23, 2006 9:52 AM EST - Posted in GPUs
The Test
For the most part, this is a high end article focusing on the three faster cards ATI announced today. We will include benchmarks of the X1900 XT 256MB both in our high end tests and in a comparison with the numbers from our recent summer midrange roundup. Our high end tests consist of higher resolutions and use the same high end platform we employed for our midrange article. Along with the benefit of using the fastest CPU we can get our hands on, this is also the type of system we might recommend high end gamers run their cards in. Thus, people interested in these cards can use our numbers to get a glimpse of what performance might look like on their own systems.
CPU: | Intel Core 2 Extreme X6800 (2.93GHz/4MB) |
Motherboard: | Intel D975XBX (LGA-775) / ASUS P5N32-SLI SE Deluxe |
Chipset: | Intel 975X / NVIDIA nForce4 Intel x16 SLI |
Chipset Drivers: | Intel 7.2.2.1007 / NVIDIA nForce 6.86 |
Hard Disk: | Seagate 7200.7 160GB SATA |
Memory: | Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2) |
Video Card: | Various |
Video Drivers: | ATI Catalyst 6.8 / NVIDIA ForceWare 91.33 |
Desktop Resolution: | 1920 x 1440 - 32-bit @ 60Hz |
OS: | Windows XP Professional SP2 |
The games we have chosen to test represent a wide variety of engines and styles. We are testing 7 games today due to the time constraints of this article. As the interest in HDR and advanced visual effects continues to rise, the tradeoff required for antialiasing is often overshadowed by the quality available from other options. This is especially true in games like Splinter Cell: Chaos Theory, Oblivion, and Black & White 2. In every game but Splinter Cell: Chaos Theory and Oblivion, we will be testing with and without 4x antialiasing. Those two games really shine when HDR is enabled, so we won't bother disabling it. (ATI still offers the "Chuck Patch" to enable both HDR and antialiasing, which can be seen as an advantage for their hardware. However, this doesn't work with all HDR modes and is currently targeted mostly at Oblivion and Splinter Cell: Chaos Theory.)
For all of our tests, the only default driver setting we change is vsync, which we set to off. All other settings are left alone, as the default settings from each camp yield generally comparable image quality. There are a few exceptions to the rule, but none of the tests we ran show any shimmering or other problems noted in the past with NVIDIA's default quality.
In reporting our results, in the hope of increasing readability, we will include a snapshot of one resolution using our standard graphing engine alongside a resolution scaling line graph.
74 Comments
Vigile - Wednesday, August 23, 2006 - link
My thought exactly on this one Anand...

Anand Lal Shimpi - Wednesday, August 23, 2006 - link
You can run dual monitors with a CrossFire card as well, the CrossFire dongle that comes with the card has your 2nd DVI output on it :)

Take care,
Anand
kneecap - Wednesday, August 23, 2006 - link
What about VIVO? The CrossFire Edition does not support that.

JarredWalton - Wednesday, August 23, 2006 - link
For high-end video out, the DVI port is generally more useful anyway. It's also required if you want to hook up to a display using HDCP - I think that will work with a DVI-to-HDMI adapter, but maybe not? S-Video and composite out are basically becoming seldom-used items in my experience, though the loss of component out is a bit more of a concern.

JNo - Thursday, August 24, 2006 - link
So if I use DVI out and attach a DVI-to-HDMI adapter before connecting to a projector or HDTV, will I get a properly encrypted signal to fully display future Blu-ray/HD DVD encrypted content?

The loss of component is a bit of a concern, as many HDTVs and projectors still produce amazing images over component. In fact, I gather that some very high resolution and refresh rate combinations are possible over component but not DVI due to certain bandwidth limitations with DVI. But please correct me if I am wrong. I take Anandtech's point on the CrossFire card offering more, but with a couple of admittedly small question marks, I see no reason not to get the standard card now and the CrossFire card second if you later decide to go that route...
JarredWalton - Thursday, August 24, 2006 - link
I suppose theoretically component could run higher resolutions than DVI, with dual-link being required for 2048x1536 and higher. Not sure what displays support such resolutions with component inputs, though. Even 1080p can run off of single-link DVI.

I think the idea with CF cards over standard is that they will have a higher resale value if you want to get rid of them in the future, and they are also more versatile -- TV out capability being the one exception. There are going to be a lot of people that get systems with a standard X1950 card, so if they want to upgrade to CrossFire in the future they will need to buy the CrossFire edition. We all know that at some point ATI is no longer going to make any of the R5xx cards, so if people wait to upgrade to CrossFire they might be forced to look for used cards in a year or two.
Obviously, this whole scenario falls apart if street prices on CrossFire edition cards end up being higher than the regular cards. Given the supply/demand economics involved, that wouldn't be too surprising, but of course we won't know for another three or four weeks.
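The single-link vs dual-link point above can be sanity-checked with a rough pixel-clock estimate. This is only a sketch: the ~20% horizontal and ~8% vertical blanking margins are assumed round-number approximations (real timings come from the VESA GTF/CVT formulas), while the 165 MHz single-link DVI ceiling is from the DVI spec.

```python
SINGLE_LINK_DVI_MHZ = 165.0  # single-link DVI pixel-clock limit per the DVI 1.0 spec

def approx_pixel_clock_mhz(width, height, refresh_hz, h_blank=1.20, v_blank=1.08):
    """Estimate the pixel clock (MHz) for a mode, padding the active
    resolution with assumed blanking overhead factors."""
    return width * h_blank * height * v_blank * refresh_hz / 1e6

for w, h in [(1920, 1080), (2048, 1536)]:
    clk = approx_pixel_clock_mhz(w, h, 60)
    verdict = "single link OK" if clk <= SINGLE_LINK_DVI_MHZ else "needs dual link"
    print(f"{w}x{h}@60: ~{clk:.0f} MHz -> {verdict}")
```

With these assumed margins, 1920x1080@60 lands around 161 MHz (just under the single-link ceiling, matching Jarred's claim), while 2048x1536@60 comes out near 245 MHz and clearly requires dual-link.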
UNESC0 - Wednesday, August 23, 2006 - link
thanks for clearing that up Anand, news to me!

TigerFlash - Wednesday, August 23, 2006 - link
I was wondering if anyone thinks it's wise to get an Intel Core 2 Duo motherboard with CrossFire support now that AMD is buying out ATI. Do you think ATI would stop supporting Intel motherboards?

johnsonx - Wednesday, August 23, 2006 - link
Of course not. AMD/ATI isn't stupid. Even if their cross-licensing agreement with Intel didn't prevent them from blocking Crossfire on Intel boards (which it almost surely does), cutting out that part of the market would be foolish.
dderidex - Wednesday, August 23, 2006 - link
What's with the $99 -> $249 gap? Weren't we supposed to see an X1650 XT, too? Based on RV570? ...or RV560? Something?