Splinter Cell: Double Agent: A Performance Analysis
by Josh Venning on December 8, 2006 2:10 AM EST - Posted in GPUs
Low End Cards
Since the second benchmark is so much more demanding graphically than the first, some of the lower-end cards had trouble running at anything above 800x600 with the quality settings at their highest. For these cards, we only included tests at two resolutions (640x480 and 800x600) instead of three.
Something interesting here is that even the lowest-end ATI card (the X1300 XT) outperforms the 7300 GS and GT, and even the 7600 GS trails the X1300 XT. On the very low end, the 7300 GT barely manages to run at 800x600, but even with these high quality settings the game may still be enjoyable at this resolution. The 7300 GS isn't so lucky: it had to be excluded from the second test because it couldn't run the game above 640x480 (and even then performance was poor), making it unfit for the high quality settings at any resolution.
Switching to medium quality settings, the 7300 GS sees a small improvement in the first benchmark but still has trouble running above 800x600. Across the cards, though, we see improvements of up to roughly 40% from both NVIDIA and ATI just by disabling trilinear filtering, high quality soft shadows, and the high detail shader option in the display menu.
Notice that the 7300 GS, while barely able to run the cruise ship benchmark at 800x600 with the highest quality settings, gets a respectable framerate at 1024x768 with the shadow effects turned off. The performance differences here are extreme, with the medium quality settings easily delivering over a 100% improvement in performance over high quality. Turning off these effects gives more flexibility for running the game at higher resolutions, but we still recommend leaving them on and playing at a lower resolution if your card permits it.
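As a rough sketch of the arithmetic behind the percentages quoted above (the FPS numbers below are hypothetical placeholders, not figures from our charts), the improvement from switching presets is computed like this:

def percent_improvement(fps_high: float, fps_medium: float) -> float:
    # Percentage gain of the medium-quality result over the high-quality one.
    return (fps_medium - fps_high) / fps_high * 100.0

# Hypothetical example: a card averaging 22 FPS at high quality and 45 FPS
# at medium quality sees roughly a 105% improvement.
print(f"{percent_improvement(22, 45):.0f}% faster at medium quality")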
28 Comments
frostyrox - Wednesday, December 13, 2006 - link
The PC gaming scene is slowly becoming a joke, and this is coming from an avid PC gamer. Nvidia and ATi release 10 different tiers of cards, completely ripping off all of us, because only 2 out of the 10 cards can actually play games well and last at least a year before they force another upgrade down our throats. I'm not buying it anymore. And Ubisoft is releasing games that don't have any support for Shader 2.0 cards (Rainbow Six Vegas and Double Agent) when many, many people are still using these cards because they're really not that old or slow. And THEN the games come out buggy as hell because they were designed for consoles and weren't properly optimized for PCs. Anyone else notice Rainbow Six Vegas PC has a PATCH out before the gamespot.com review is even up for the game? Hahaha. The PC gaming scene is a joke, and the joke's on all of us. The question is whether us gamers are gonna take it anymore. I'm not.
frostyrox - Wednesday, December 13, 2006 - link
I'd also like to point out websites like Tomshardware and Anandtech fully know that the only reason Oblivion runs like a total turd on every video card configuration available is because it was poorly ported over to PC. It has literally NOTHING to do with the game being "a true test for videocards" or "amazingly NASA advanced graphics LOL". But instead of being real about the whole thing, Toms and Anand try their hardest not to upset the bigwigs or bring attention to this fact, I suppose so they can keep getting their free test hardware and other support for their site. It's all good. Any monkey can clearly look at the game and see the truth. Microsoft doesn't care about gamers. About the only thing they do care about is "beating Sony and Nintendo" (which they won't, and will never ever do). This is exactly why Oblivion was an extremely rushed title full of bugs, glitches and overall turd performance. I'm finished ranting. Have a Nice Day.
lemonadesoda - Sunday, December 10, 2006 - link
What on earth is the reviewer doing testing different cards but on a very, very high end CPU? I really cannot imagine ANYONE with such a CPU using a low end card. The tests are not helpful for the typical user. It would have been much better to do the tests with a typical CPU (e.g. a P4 or D at 3.0GHz) with all these cards. That way the typical user gets an idea how the game will perform on their EXISTING system or with a GPU upgrade.
Alternatively, take a typical GPU, say X800 or X1650 or X1950 and test with different CPUs, e.g. P4 3.0 and CD 2.0, and C2D 3.0 to get an idea how the game will perform on a typical PC or with a CPU upgrade.
Josh Venning - Sunday, December 10, 2006 - link
Thanks for the comment. For this review, our focus was on how Double Agent performs across different graphics cards. A faster CPU gives us more flexibility when testing, because with a slower CPU we wouldn't be able to see the real differences in how high end graphics cards handle the game. For lower end graphics cards, a slower CPU won't have as much of an impact because the game will already be GPU limited rather than CPU limited. We may see slightly lower results, but really the only thing a slower CPU would do is obscure the differences between graphics cards. This is how we have approached all of our graphics hardware reviews over the past few years, and how we will continue to test graphics cards in the future. The idea is to eliminate as many other bottlenecks as possible so we can look at the capabilities of the hardware we are trying to study.
Double Agent CPU performance is definitely something we could look at in a future article, but we will be waiting for Ubisoft to fix some of the problems that make this game difficult to test.
Obviously, when making a buying decision, all aspects of a system must be taken into account. We can't review every possible system (the combinations are way too numerous), but we can review a huge number of individual components and know where the bottleneck would be before we build a system.
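A rough way to picture the GPU-limited point above (a simplified sketch with hypothetical numbers, not how we actually test) is that frame rate ends up capped by whichever stage takes longer per frame:

def estimated_fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    # Crude bottleneck model: frame rate is roughly capped by whichever
    # stage (CPU or GPU) takes longer per frame.
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Hypothetical numbers: a low-end card needing 40 ms per frame is GPU limited,
# so swapping a 10 ms-per-frame CPU for a 6 ms-per-frame one changes nothing.
print(estimated_fps(10, 40))  # 25.0 FPS
print(estimated_fps(6, 40))   # still 25.0 FPS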
Xcom1Cheetah - Saturday, December 9, 2006 - link
Can the power requirements of the GPU cards be checked along with the tests? Just wanted to know how much difference there is between the 7900GS and X1950 wrt power requirements... Btw, very well covered article...
Rand - Friday, December 8, 2006 - link
It would have been nice to see some GeForce 6 series graphics cards tested; they're still in a considerable number of systems and are SM 3.0 capable. I'm also rather disappointed only one processor was tested. I think it would be worthwhile to get a gauge of CPU dependency in the game, especially as related to the individual graphics cards.
JarredWalton - Friday, December 8, 2006 - link
Typically we either do a look at GPU performance with one CPU, or a look at CPU performance with one GPU (usually after determining the best GPU for a game). Benching a selection of GPUs and CPUs all at the same time is simply impractical. Running four resolutions, two levels, and two/three detail settings with 10 GPUs already means doing about 200 test configurations (give or take). Now if you wanted to test those with 5 CPUs....
Anyway, maybe Josh can look at a separate CPU scaling article in the near future if there's enough interest in that. If SCDA becomes part of our standard benchmark suite, it will also be covered with CPU launches in the future. More likely is that we will use R6 Las Vegas instead (if we add something new from the Clancy game world).
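Just to make that back-of-the-envelope math explicit (the counts are the rough ones from above, not an exact test plan):

# Approximate size of the test matrix described above; the "two/three detail
# settings" figure is averaged to 2.5 purely for this rough estimate.
resolutions = 4
levels = 2
detail_presets = 2.5
gpus = 10

runs_one_cpu = resolutions * levels * detail_presets * gpus
print(f"~{runs_one_cpu:.0f} configurations with one CPU")        # ~200
print(f"~{runs_one_cpu * 5:.0f} configurations with five CPUs")  # ~1000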
poohbear - Friday, December 8, 2006 - link
Why did AnandTech choose this game to benchmark? It doesn't exactly stand out as a graphically intensive game, especially since the first Unreal Engine 3 game is coming out in a few days (Rainbow Six: Las Vegas; I know RoboBlitz is the first game, but it hardly demonstrates what UE3 is capable of). I'd much rather see benchies for Rainbow Six: Las Vegas, which will show us firsthand what kind of hardware is needed for the next year. Just my 2 cents.
Josh Venning - Friday, December 8, 2006 - link
Actually, we are planning to review Rainbow Six Las Vegas when we can get a hold of it, so good suggestion. :-) Double Agent may not be the most graphically intensive game ever released, but it's still a fairly high-profile release and we wanted to keep our readers informed about its performance.
imaheadcase - Friday, December 8, 2006 - link
Clearly not, judging from the screenshots; the graphics don't look like anything.