ATI's New High End and Mid Range: Radeon X1950 XTX & X1900 XT 256MB
by Derek Wilson on August 23, 2006 9:52 AM EST - Posted in GPUs
The Test
For the most part, this is a high end article focusing on the three faster cards ATI announced today. We will include benchmarks of the X1900 XT 256MB both in our high end tests and in a comparison with the numbers we ran for our recent summer midrange roundup. Our high end tests will consist of higher resolutions and will use the same high end platform we employed for our midrange article. Along with the benefit of using the fastest CPU we can get our hands on, this is also the type of system we might recommend for high end gamers to run these cards in. Thus, readers interested in these cards can use our numbers to get a glimpse of what actual performance might look like on their own systems.
| CPU: | Intel Core 2 Extreme X6800 (2.93GHz/4MB) |
| Motherboard: | Intel D975XBX (LGA-775) / ASUS P5N32SLI SE Deluxe |
| Chipset: | Intel 975X / NVIDIA nForce4 Intel x16 SLI |
| Chipset Drivers: | Intel 7.2.2.1007 / NVIDIA nForce 6.86 |
| Hard Disk: | Seagate 7200.7 160GB SATA |
| Memory: | Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2) |
| Video Card: | Various |
| Video Drivers: | ATI Catalyst 6.8 / NVIDIA ForceWare 91.33 |
| Desktop Resolution: | 1920 x 1440 - 32-bit @ 60Hz |
| OS: | Windows XP Professional SP2 |
The games we have chosen to test represent a wide variety of engines and styles. We are testing 7 games today due to the time constraints of this article. As interest in HDR and advanced visual effects continues to rise, the tradeoff required for antialiasing is often overshadowed by the quality available from other options. This is especially true in games like Splinter Cell: Chaos Theory, Oblivion, and Black & White 2. In every game but Splinter Cell: Chaos Theory and Oblivion, we will be testing with and without 4x antialiasing. Those two games really shine when HDR is enabled, so we won't bother disabling it. (ATI still offers the "Chuck Patch" to enable both HDR and antialiasing, which can be seen as an advantage for their hardware. However, this doesn't work with all HDR modes and is currently targeted mostly at Oblivion and Splinter Cell: Chaos Theory.)
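To make the test plan concrete, here is a minimal sketch of that settings matrix in Python. It is illustrative only: the per-game entries are limited to the titles named above, the full seven-game list is not reproduced, and this is not our actual benchmark harness.

```python
# Illustrative sketch only: a partial settings matrix for the games named
# above. aa_modes lists the antialiasing levels run for each title.
test_settings = {
    # HDR-focused titles where we leave HDR on and skip the separate 4xAA pass
    "Splinter Cell: Chaos Theory": {"hdr": True, "aa_modes": [0]},
    "Oblivion": {"hdr": True, "aa_modes": [0]},
    # Every other title in the suite is run with and without 4xAA
    "Black & White 2": {"hdr": True, "aa_modes": [0, 4]},
    # ...remaining games follow the same [0, 4] pattern
}

for game, cfg in test_settings.items():
    for aa in cfg["aa_modes"]:
        hdr = "on" if cfg["hdr"] else "off"
        print(f"{game}: HDR {hdr}, {aa}x AA")
```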
For all of our tests, the only default driver setting we change is vsync, which we set to off. All other settings are left alone, as the default settings from each camp yield generally comparable image quality. There are a few exceptions to the rule, but none of the tests we ran show any shimmering or other problems noted in the past with NVIDIA's default quality.
In reporting our results, in the hope of improving readability, we will include a snapshot of one resolution using our standard graphs alongside a resolution scaling line graph.
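For readers who want to build this style of chart from their own numbers, here is a minimal sketch using Python and matplotlib. The frame rates and card names below are placeholders, not measured results from this review.

```python
import matplotlib.pyplot as plt

# Placeholder data only -- not measured results -- showing the shape of a
# resolution scaling line graph (average frame rate vs. resolution).
resolutions = ["1280x1024", "1600x1200", "1920x1440", "2048x1536"]
card_a_fps = [95, 78, 62, 51]  # hypothetical card A
card_b_fps = [88, 70, 55, 44]  # hypothetical card B

plt.plot(resolutions, card_a_fps, marker="o", label="Card A")
plt.plot(resolutions, card_b_fps, marker="s", label="Card B")
plt.xlabel("Resolution")
plt.ylabel("Average FPS")
plt.title("Resolution scaling (placeholder data)")
plt.legend()
plt.show()
```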
74 Comments
SixtyFo - Friday, September 15, 2006 - link
So do they still use a dongle between the cards? If you had 2 xfire cards then it won't be connecting to a dvi port. Is there an adaptor? I guess what I'm asking is, are you REALLY sure I can run 2 crossfire ed. x1950s together? I'm about to drop a grand on video cards so that piece of info may come in handy.
unclebud - Friday, September 1, 2006 - link
"And 10Mhz beyond the X1600 XT is barely enough to warrant a different pair of letters following the model number, let alone a whole new series starting with the X1650 Pro."nvidia has been doing it for years with the 4mx/5200/6200/7300/whatever and nobody here said boo!
hm.
SonicIce - Thursday, August 24, 2006 - link
How can a whole X1900XTX system use only 267 watts? So a 300w power supply could handle the system?
DerekWilson - Saturday, August 26, 2006 - link
generally you need something bigger than a 300w psu; the main problem is that the current supply on both 12v rails must be fairly high.
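To put rough numbers on that reply, here is a back-of-the-envelope sketch in Python. Only the 267W system draw comes from the review; the efficiency and 12V load-share figures are assumptions for illustration.

```python
# Back-of-the-envelope sketch: why the combined 12V current rating matters
# more than the wattage on the label. Efficiency and 12V share are assumed
# values, not measurements.
wall_watts = 267              # total system draw quoted above
psu_efficiency = 0.80         # assumed era-typical efficiency
twelve_volt_share = 0.75      # assumed fraction of DC load on the 12V rails

dc_watts = wall_watts * psu_efficiency            # ~214 W delivered
amps_12v = dc_watts * twelve_volt_share / 12.0    # ~13 A combined, steady state
print(f"Estimated combined 12V draw: {amps_12v:.1f} A")
# Many budget 300W units of the era were rated for well under 20 A combined
# on 12V, leaving little headroom for load spikes -- hence the advice to go
# with a bigger supply.
```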
Trisped - Thursday, August 24, 2006 - link
The crossfire card is not the same as the normal one. The normal card also has the extra video out options. So there is a reason to buy the one to team up with the other, but only if you need to output to a composite, s-video, or component.
JarredWalton - Thursday, August 24, 2006 - link
See discussion above under the topic "well..."
bob4432 - Thursday, August 24, 2006 - link
why is the x1800xt left out of just about every comparison i have read? for the price you really can't beat it....
araczynski - Thursday, August 24, 2006 - link
...I haven't read the article, but i did want to just make a comment...having just scored a brand new 7900gtx for $330 shipped, it feels good to be able to see the headlines for articles like this, ignore them, and think "...whew, i won't have to read anymore of these until the second generation of DX10's comes out..."
I'm guessing nvidia will be skipping the 8000's, and 9000's, and go straight for the 10,000's, to signal the DX10 and 'uber' (in hype) improvements.
either way, its nice to get out of the rat race for a few years.
MrJim - Thursday, August 24, 2006 - link
Why no Anisotropic filtering tests? Or am i blind?
DerekWilson - Saturday, August 26, 2006 - link
yes, all tests are performed with at least 8xAF. Under games that don't allow selection of a specific degree of AF, we choose the highest quality texture filtering option (as in BF2 for instance). AF comes at fairly little cost these days, and it just doesn't make sense not to turn on at least 8x. I wouldn't personally want to go any higher without angle independent AF (like the high quality AF offered on ATI x1k cards).