Splinter Cell: Double Agent: A Performance Analysis
by Josh Venning on December 8, 2006 2:10 AM EST - Posted in GPUs
NVIDIA AA Performance
Antialiasing was something we wanted to examine with Double Agent, and we found some quirks with AA in the game. The game shipped without any in-game AA option, but the 1.01 patch added the setting to the game menu. On NVIDIA hardware, Double Agent is unable to run AA with HDR effects enabled, something we also saw in previous Splinter Cell games. ATI hardware, on the other hand, has no trouble running the game with both AA and HDR effects enabled, but unlike NVIDIA, ATI cards had trouble running AA with HDR disabled. For this reason, we split our AA tests into separate sections for NVIDIA and ATI. We can't make an apples-to-apples comparison of AA performance between ATI and NVIDIA hardware, but we can see what kind of performance impact AA has on each type of card.
We tested AA performance on the first benchmark, with Sam Fisher sliding down the zip-line at night. We saw that disabling HDR didn't have much performance impact on the game, while with NVIDIA hardware the game took a considerable performance hit with AA enabled. Because the game currently supports resolutions only up to 1600x1200, AA is potentially more useful than usual, at least for those with larger displays and a GPU powerful enough to handle it.
NVIDIA 4xAA Without HDR (average frames per second; a dash indicates no result was reported)

Card | 640x480 | 800x600 | 1024x768 | 1280x1024 | 1600x1200
NVIDIA GeForce 7300 GS | 14.1 | 9.9 | - | - | -
NVIDIA GeForce 7300 GT | 23.3 | 17.4 | 12.0 | - | -
NVIDIA GeForce 7600 GS | 30.9 | 23.5 | 16.6 | 11.1 | -
NVIDIA GeForce 7600 GT | 46.6 | 35.4 | 25.2 | 17.1 | -
NVIDIA GeForce 7900 GS | 58.5 | 48.0 | 35.3 | 23.5 | 12.4
NVIDIA GeForce 7950 GT | 59.8 | 57.4 | 44.3 | 31.6 | 23.4
NVIDIA GeForce 7900 GTX | 60.2 | 59.1 | 49.4 | 35.5 | 26.4
With AA enabled and HDR off, frame rates drop sharply; only the 7900 GTX and 7950 GT can really run well at 1600x1200. Also keep in mind that we used the less demanding of our two benchmarks (the other is on the cruise ship) for AA testing, so performance in other areas of the game will be even worse with this option enabled. Comparing these results to our non-AA results for the same benchmark, disabling AA yields upwards of a 60% performance increase for most cards at the same resolutions. AA can still be useful for cleaning up jaggies on cards like the 7900 GS or 7600 GT, which run the game more or less smoothly at 1280x1024 with AA enabled, especially if you don't have a display that supports higher resolutions.
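To put that 60% figure in concrete terms, here is a small illustrative Python sketch (not part of our test harness) that relates the quoted AA-off gain to the measured AA-on numbers in the table above:

```python
# Illustrative only: estimate the no-AA frame rate implied by the
# "upwards of 60%" gain quoted above, given a measured AA-on result.
def aa_off_estimate(aa_on_fps: float, gain_pct: float = 60.0) -> float:
    """Frame rate implied by disabling AA, given a percentage gain."""
    return aa_on_fps * (1 + gain_pct / 100.0)

# The 7900 GTX measured 35.5 fps at 1280x1024 with 4xAA (table above);
# a 60% gain implies roughly 56.8 fps with AA off (an estimate derived
# from the article's figure, not a measured result).
print(f"{aa_off_estimate(35.5):.1f} fps")  # -> 56.8 fps
```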
ATI AA Performance
As we mentioned, ATI hardware was able to run with both AA and HDR enabled, but had trouble running AA with HDR off. With HDR disabled, we saw graphical problems that made the game unplayable at 1600x1200 on all of the cards: the scene rendered improperly, making it impossible to see anything unless heat vision was enabled. Some cards exhibited the same problem at other resolutions as well. We therefore tested the ATI cards with both HDR and AA enabled in order to see what kind of impact AA has on performance.
ATI 4xAA With HDR (average frames per second)

Card | 640x480 | 800x600 | 1024x768 | 1280x1024 | 1600x1200
ATI Radeon X1300 XT | 32.5 | 24.0 | 16.3 | 11.2 | 7.2
ATI Radeon X1650 Pro | 36.1 | 25.7 | 18.6 | 12.7 | 8.4
ATI Radeon X1650 XT | 51.9 | 49.4 | 43.3 | 17.2 | 12.4
ATI Radeon X1900 XT 256 | 52.5 | 50.3 | 46.3 | 34.5 | 24.4
ATI Radeon X1950 Pro | 53.7 | 49.0 | 40.0 | 28.3 | 21.2
ATI Radeon X1950 XTX | 53.7 | 53.4 | 52.7 | 40.9 | 31.3
As with NVIDIA, turning on AA causes a large performance drop on these ATI cards. The impact is greater than we saw with NVIDIA, but keep in mind that both HDR and AA are enabled for these tests. Lower-end cards like the X1300 XT simply don't have the capacity to handle AA well and suffer badly here. The X1950 XTX, on the other hand, handles AA better; though it still takes a performance hit, it has more than enough power to keep the game running smoothly at 1600x1200 with AA enabled.
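The table above makes that gap concrete. As a quick sketch (plain Python, with the fps values copied straight from the table), here is how much each card loses moving from 640x480 to 1600x1200 with 4xAA and HDR enabled:

```python
# Resolution scaling computed from the ATI 4xAA + HDR table above:
# measured average fps at 640x480 and 1600x1200 for each card.
results = {
    "X1300 XT":     (32.5,  7.2),
    "X1650 Pro":    (36.1,  8.4),
    "X1650 XT":     (51.9, 12.4),
    "X1900 XT 256": (52.5, 24.4),
    "X1950 Pro":    (53.7, 21.2),
    "X1950 XTX":    (53.7, 31.3),
}

for card, (fps_low, fps_high) in results.items():
    drop = (1 - fps_high / fps_low) * 100
    print(f"{card}: {fps_low} -> {fps_high} fps ({drop:.0f}% drop)")
```

The X1300 XT gives up about 78% of its frame rate over that span, while the X1950 XTX loses only about 42%, which is why the latter stays playable at 1600x1200.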
It's unfortunate that disabling HDR causes problems on ATI hardware when AA is enabled, but the performance difference between HDR on and off isn't that great on ATI anyway. If you're looking to pick up extra performance while running AA, on ATI as well as NVIDIA hardware, you're better off disabling high quality soft shadows and/or the high detail shader setting.
28 Comments
frostyrox - Wednesday, December 13, 2006
The PC gaming scene is slowly becoming a joke, and this is coming from an avid PC gamer. NVIDIA and ATI release 10 different tiers of cards, completely ripping off all of us, because only 2 out of the 10 cards can actually play games well and last at least a year before they force another upgrade down our throats. I'm not buying it anymore. And Ubisoft is releasing games that don't have any support for Shader 2.0 cards (Rainbow Six Vegas and Double Agent) when many, many people are still using these cards because they're really not that old or slow. And THEN the games come out buggy as hell because they were designed for consoles and weren't properly optimized for PCs. Anyone else notice Rainbow Six Vegas PC has a PATCH out before the gamespot.com review is even up for the game? Hahaha. The PC gaming scene is a joke, and the joke's on all of us. The question is whether we gamers are gonna take it anymore. I'm not.

frostyrox - Wednesday, December 13, 2006
I'd also like to point out that websites like Tom's Hardware and AnandTech fully know that the only reason Oblivion runs like a total turd on every video card configuration available is because it was poorly ported over to PC. It has literally NOTHING to do with the game being "a true test for video cards" or "amazingly NASA advanced graphics LOL". But instead of being real about the whole thing, Tom's and Anand try their hardest not to upset the bigwigs and bring attention to this fact, I suppose so they can keep getting their free test hardware and other support for their sites. It's all good. Any monkey can clearly look at the game and see the truth. Microsoft doesn't care about gamers. About the only thing they do care about is "beating Sony and Nintendo" (which they won't, and will never ever do). This is exactly why Oblivion was an extremely rushed title full of bugs, glitches, and overall turd performance. I'm finished ranting. Have a Nice Day.

lemonadesoda - Sunday, December 10, 2006
What on earth is the reviewer doing testing different cards on a very, very high end CPU? I really cannot imagine ANYONE with such a CPU using a low end card. The tests are not helpful for the typical user. It would have been much better to run the tests with a typical CPU (e.g. a P4 or Pentium D at 3.0GHz) with all these cards. That way the typical user gets an idea how the game will perform on their EXISTING system or with a GPU upgrade.
Alternatively, take a typical GPU, say an X800, X1650, or X1950, and test with different CPUs, e.g. a P4 at 3.0GHz and a C2D at 2.0GHz and 3.0GHz, to get an idea how the game will perform on a typical PC or with a CPU upgrade.
Josh Venning - Sunday, December 10, 2006
Thanks for the comment. For this review, our focus was on how Double Agent performs across different graphics cards. A faster CPU gives us more flexibility when testing, because with a slower CPU we wouldn't be able to see the real differences in how high end graphics cards handle the game. For lower end graphics cards, a slower CPU won't have as much of an impact because the game will already be GPU limited rather than CPU limited. We might see slightly lower results, but really the only thing a slower CPU would do is obscure the differences between graphics cards. This is how we have approached all of our graphics hardware reviews over the past few years, and how we will continue to test graphics cards in the future. The idea is to eliminate as many other bottlenecks as possible so we can look at the capabilities of the hardware we are trying to study.

Double Agent CPU performance is definitely something we could look at in a future article, but we will be waiting for Ubisoft to fix some of the problems that make this game difficult to test.
Obviously, when making a buying decision, all aspects of a system must be taken into account. We can't review every possible system (the combinations are far too numerous), but we can review a huge number of individual components and know where the bottleneck would be before we build a system.
Xcom1Cheetah - Saturday, December 9, 2006
Can the power requirements of the GPUs be checked along with the tests? Just wanted to know how much difference there is between the 7900 GS and the X1950 with respect to power draw...

Btw, very well covered article...
Rand - Friday, December 8, 2006
It would have been nice to see some GeForce 6 series graphics cards tested; they're still in a considerable number of systems and are SM 3.0 capable.

I'm also rather disappointed only one processor was tested. I think it would be worthwhile to get a gauge of CPU dependency in the game, especially as related to the individual graphics cards.
JarredWalton - Friday, December 8, 2006
Typically we either do a look at GPU performance with one CPU, or a look at CPU performance with one GPU (usually after determining the best GPU for a game). Benchmarking a selection of GPUs and CPUs all at the same time is simply impractical. Running four resolutions, two levels, and two/three detail settings with 10 GPUs already means doing about 200 test configurations (4 x 2 x 2.5 x 10, give or take). Now if you wanted to test those with 5 CPUs....

Anyway, maybe Josh can look at a separate CPU scaling article in the near future if there's enough interest. If SCDA becomes part of our standard benchmark suite, it will also be covered with CPU launches in the future. More likely is that we will use R6 Las Vegas instead (if we add something new from the Clancy game world).
poohbear - Friday, December 8, 2006
Why did AnandTech choose this game to benchmark? It doesn't exactly stand out as a graphically intensive game, especially since the first Unreal Engine 3 game is coming out in a few days (Rainbow Six: Las Vegas; I know RoboBlitz is the first game, but it hardly demonstrates what UE3 is capable of). I'd much rather see benchies for Rainbow Six: Las Vegas, which will show us firsthand what kind of hardware is needed for the next year. Just my 2 cents.

Josh Venning - Friday, December 8, 2006
Actually, we are planning to review Rainbow Six Las Vegas when we can get a hold of it, so good suggestion. :-) Double Agent may not be the most graphically intensive game ever released, but it's still a fairly high-profile release and we wanted to keep our readers informed about its performance.

imaheadcase - Friday, December 8, 2006
Clearly not, judging from the screenshots; the graphics don't look like anything special.