Far Cry 2 Dissected: Massive Amounts of Performance Data
by Derek Wilson on November 21, 2008 5:30 AM EST - Posted in GPUs
Benchmarking Software: an Analysis of Far Cry 2 Settings under AMD and NVIDIA
Before we get started, let's take a look at our test setup:
Platform: ASUS Rampage II Extreme, Core i7-965, 6GB DDR3-1333, Intel SSD
AMD Driver: Final 8.10 hotfix
NVIDIA Driver: 180.44
Our first goal in getting our testing rolling was to find out what to test and to better understand the settings in the game. We spent time playing the game at different quality levels with different hardware, as we generally do. But because we wanted to take advantage of the benchmark tool, we decided to collect a bit of data on different settings with one card from AMD and one card from NVIDIA. We looked at three quality levels under two DirectX APIs with two AA settings across five resolutions. For those keeping count, that's 60 tests per card, or 120 tests total for this section.
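For those who like to see the combinatorics spelled out, the 60-test count is just the cross product of the setting axes. A quick sketch (the five resolutions listed are assumptions for illustration; the exact set is defined by our graphs):

```python
from itertools import product

# Setting axes used in this section. The resolution list is an
# assumption for illustration -- the exact five come from the graphs.
qualities = ["High", "Very High", "Ultra High"]
apis = ["DX9", "DX10"]
aa_modes = ["No AA", "4xAA"]
resolutions = ["1280x1024", "1680x1050", "1920x1200", "2048x1536", "2560x1600"]

# One card's full test matrix: 3 * 2 * 2 * 5 = 60 runs
matrix = list(product(qualities, apis, aa_modes, resolutions))
print(len(matrix))  # 60 per card, 120 across both cards
```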
The result isn't much like our usual hardware-focused tests; rather, it provides more of an analysis of the game itself. We get a better perspective on how the game responds in different situations with different hardware on different platforms, without the need to test every piece of hardware out there. Our hope is that this page can help people running a particular setup see generally how performance might change if they tweaked one of the variables. Of course, you can't predict specific performance from this data, as there isn't enough of it for interpolation, but knowing the general trend and which changes make the largest differences can still be useful.
This test is run with our custom timedemo rather than any of the built in benchmarks.
The cards we chose are the highest end NVIDIA and AMD single GPU solutions (the GeForce GTX 280 and the Radeon HD 4870 1GB). While not everyone will have these cards, we were able to test the broadest range of playable data with them. We'll start our analysis with the NVIDIA hardware in DX9 and DX10.
Now take a deep breath, because these graphs can be a little tricky. Each graph contains only six resolution scaling lines, but you'll want to approach them as two groups of three: blue diamonds, red squares, and green triangles are no antialiasing, while purple Xs, blue asterisks, and orange circles are 4xAA.
Under DX9 on NVIDIA hardware, High quality performs significantly better than Very High quality both with and without AA. Moving from Very High quality to High quality gives at best a 47% increase in performance, while the worst case is 27% with 4xAA and 37% without. The performance gains in this case generally trend downward as resolution increases. We also see that High quality with 4xAA outperforms Very High quality with no AA. While there is a crossover point, Very High quality with 4xAA also performs very similarly to Ultra High quality with no AA.
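The percentage gains above are computed relative to the slower setting. A minimal sketch of the arithmetic, using hypothetical framerates (the numbers below are made up to show the calculation, not measured results):

```python
def percent_gain(new_fps, base_fps):
    # Percentage framerate increase of new_fps over base_fps
    return (new_fps - base_fps) / base_fps * 100.0

# Hypothetical example: 75 fps at High vs. 51 fps at Very High
# works out to roughly the 47% best-case gain cited above.
print(round(percent_gain(75.0, 51.0)))  # 47
```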
Moving to DX10 on NVIDIA hardware, High quality performance takes a dive while the rest of the numbers stay relatively stable. The basic indication here is that DX9 won't gain you much performance (and will sometimes drop your performance a bit) unless you are looking at High quality mode, in which case it could be well worth it to run DX9. As a further consequence, the performance benefit of dropping down to High quality in DX10 mode is essentially nonexistent, and High quality with 4xAA loses its advantage over Very High quality with no AA. Very High quality or better is the way to go under DX10, and DX9 should only be paired with High quality mode or lower.
The analysis of the AMD data is very similar to what we see with NVIDIA. We see the same big performance advantage of High quality DX9, with DX10 actually increasing performance at the higher quality levels (the exception is at 2560x1600, where performance drops off more sharply than on the GTX 280). The major difference here is that moving from Ultra High quality to Very High quality gives you a much larger performance increase under AMD than NVIDIA. This means that Very High with 4xAA has a larger advantage over Ultra High with no AA (except at 2560x1600), and that it is more worth it to drop back to a lower quality setting to gain performance on AMD hardware. We still recommend Ultra High quality, though, unless 4xAA is something you just can't live without (in that case, Very High quality plus 4xAA is probably the way to go).
The comparison we haven't made yet is NVIDIA versus AMD. These tests show that under DX10 the AMD Radeon HD 4870 1GB either outperforms or performs on par with the NVIDIA GeForce GTX 280 (except at ultra high resolutions with 4xAA). This is very impressive given the Radeon's $100 price advantage (at $400, the GeForce GTX 280 comes in 33% more expensive than the Radeon HD 4870 1GB). If you've got a 2560x1600 monitor and want to run Ultra High quality with 4xAA, that's the only case where the GeForce GTX 280 is worth it, though you'll be pushing the playability limit there, and SLI with two cheaper cards might be a better way to go.
Going forward, we will be looking at DX10 with Ultra High quality settings and will generally favor testing without AA as we feel that Ultra High quality is a better use of resources than 4xAA. For multi-GPU and high end testing, we will still try to include 4xAA numbers though. This custom timedemo will also be the test we stick with rather than the built in RanchSmall demo.
78 Comments
helldrell666 - Monday, November 24, 2008 - link
Hey, if you don't own an ATI card then don't talk. I run this game at 1920x1280 res. with all the setts set to ultra high + 8x/16x aa/af and my 4870 1G toxic plays the game pretty well with very good frame rates.

SiliconDoc - Saturday, November 29, 2008 - link
Well at least we know a Toxic (Sapphire) works, but on what motherboard (and perhaps ram, though less needed as a clue) we still don't know. I guess after this I'll search your profile for your "rig" - and if that comes up empty I won't buy a 4870 1G Toxic because I don't know what motherboard/chipset the drivers are working on.
Nvidia says to you "Thanks for all the help".
JonnyDough - Sunday, November 23, 2008 - link
"It is worth noting that this is the kind of issue that really damages AMD's credibility with respect to going single card CrossFire on the high end. We absolutely support their strategy, but they have simply got to execute."

LOL! "Simply got to execute?" You can't even execute properly English!
JonnyDough - Sunday, November 23, 2008 - link
"This type of a fumble is simply unacceptable." - the last sentence of that paragraph. ROFL.

GTVic - Sunday, November 23, 2008 - link
You can't complain about debatable ATI driver problems when you have the other graphics company paying money for the developer to fully test and optimize the game against their drivers.

Also, as a general comment, why is it always the graphics card designer's problem when a game has problems? I don't have to upgrade my printer drivers every time I install a new application that has printing capabilities. There is something off about the PC gaming graphics card and the PC gaming industries.
Genx87 - Monday, November 24, 2008 - link
1. There doesn't appear to be anything to debate. They see the problems and continue to see the problems.
2. The Nvidia program only helps with code optimizations. Provided ATI is staying within DX10 specifications, it shouldn't have a problem running the code. In fact, in the past ATI cards have run very well and sometimes even beat Nvidia cards in games within this program.
3. When printer drivers are doing the workload and function of a graphics driver, let us know. Until then it is pretty silly to compare a printer driver with a graphics driver.
sbuckler - Monday, November 24, 2008 - link
All I want is a game that runs on my graphics card; I don't really care how that was achieved. I don't think Nvidia do *pay* the games company to make the game run better. They do, however, invest time and effort with that company to make the game run well on their cards, which costs Nvidia.
Ati users shouldn't be complaining about TWIMTBP, they should be asking why Ati aren't doing the same thing because it works.
SiliconDoc - Friday, November 28, 2008 - link
Good comment, and correct; the problem is of course that those without the problem, for whatever reason, chime in as if it doesn't exist for anyone else. Last time I checked, the video cards are sold under more than one manufacturer/brand name, and Derek pointed out ATI needs to test under a wider variety of hardware configurations.

So good job on the printer driver comment, and you hit the nail on the head - for some reason ATI is blowing their driver releases.
No doubt it is very complex and difficult to achieve a good driver with stability across many games and platforms, and for whatever reason ATI just can't handle it right now.
It's too bad people can't admit that.
I think it would be quite wonderful, considerate, and INTELLIGENT, if the people chiming in that their ATI 4870 or whatever ran fine had the sense - especially here - to post the brand and the rest of their setup, so others looking to buy and looking at this review and having or not having problems can make a logical, reasonable, helpful analysis - and choose the right brand or combo setup.
Sad, though, I haven't seen that - just a sort of dissing (Mine works fine! What the xxx xxx xx xx )- that isn't helpful at all - and if ATI techs are reading, they get no clue from all of it either - what brand and board and setup is doing what well.
It's not very bright, it's quite selfish.
Oh well, the worst of it is - it will help things to stay in a bad way for too many ATI users - and then without some miracles from the driver dev team - rinse and repeat is coming along - over and over again.
atakiii - Sunday, November 23, 2008 - link
I'm not entirely sure whether Mr. Wilson fully understands the AMD/ATI driver release cycle.

"Maintaining a monthly driver release schedule is detrimental to AMD's ability to release quality drivers. This is not the first or only issue we've seen that could have been solved (or at least noticed) by expanded testing that isn't possible with such tight release deadlines."
This passage implies that all the development and testing for a particular release occurs in the month prior to release. This is highly unlikely, and this article from Phoronix (http://www.phoronix.com/scan.php?page=article&...) shows that each driver is in development and testing for about 11 weeks.
Obviously, hotfixes won't follow this release cycle and newer games won't be properly optimised until the driver release with a development phase corresponding to the game's release.