The Elder Scrolls IV: Oblivion GPU Performance
by Anand Lal Shimpi on April 26, 2006 1:07 PM EST - Posted in GPUs
Setting Expectations
While we've grown used to running high end SLI setups at 3 megapixel resolutions (2048 x 1536 and up) and still getting reasonable frame rates, the same is not true for Oblivion. In fact, there isn't a single GPU or pair of GPUs today that will run Oblivion smoothly at 1600 x 1200 with everything turned on. Our highest end X1900 XT CrossFire setup can't even run our most stressful real world Oblivion test at 1280 x 1024 with all of the detail settings at their highest values. That's right: $1200 worth of GPUs will get you less than 50 fps at less than the highest image quality, and we're not even talking about having AA enabled.
With Oblivion, you've got to set your expectations appropriately for what good performance is. If your frame rate never drops below 30 fps, you've put together a very fast system for Oblivion. The problem is that while performance may be in the 60 - 70 fps range indoors or while taking a stroll around town, as soon as you engage a few enemies or walk near an Oblivion Gate your frame rate may drop into the teens. A choppy frame rate really impacts your ability to do things like slice the correct opponent rather than someone you're trying to protect. If a video card can maintain a minimum of 20 fps in our most strenuous test (the Oblivion Gate benchmark), it will serve you very well; otherwise you may want to start turning down some of the visual quality options.
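For context on why we focus on the minimum rather than the average, here is a minimal sketch of deriving average and worst-second frame rates from a per-frame time log. The log format (milliseconds per frame, as a tool like FRAPS can record) is our assumption for illustration, not a description of the exact capture pipeline used for this article:

```python
# Minimal sketch: average fps and the worst single second, computed
# from a list of per-frame render times in milliseconds.

def fps_stats(frame_times_ms):
    """Return (average fps over the run, frames rendered in the worst second)."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s

    # Bucket frames by the whole second in which they started and count
    # frames per bucket. Note: the final bucket may cover less than a
    # full second and can undercount slightly.
    buckets = {}
    elapsed_ms = 0.0
    for ft in frame_times_ms:
        sec = int(elapsed_ms // 1000)
        buckets[sec] = buckets.get(sec, 0) + 1
        elapsed_ms += ft
    min_fps = min(buckets.values())
    return avg_fps, min_fps

# One smooth second (~60 fps) followed by one choppy second (~20 fps):
times = [16.7] * 60 + [50.0] * 20
print(fps_stats(times))  # ~(40.0, 20)
```

The example makes the point: an average near 40 fps can still hide a full second spent at 20 fps, which is exactly the kind of dip you feel mid-combat.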
Oblivion is also one of those rare games where turning down the image quality options not only affects how good the game looks, it can have a serious impact on gameplay as well. Turning down your view distance is a surefire way to increase performance, but the lower your view distance, the harder it is to spot landmarks you're searching for. You can decrease the distance at which characters, items and other objects appear, giving you better performance but also putting you at a disadvantage when you're looking for a particular item or when someone is about to attack you. With Oblivion it's not just about tolerating slightly blurred textures and jagged edges to maintain higher frame rates; the total experience of the game depends on having a powerful system with a fast GPU.
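As a practical aside for readers who do want to trade distance for speed: the in-game fade sliders are backed by plain-text entries in Oblivion.ini (found under My Documents\My Games\Oblivion). The [LOD] entry names below match what is commonly reported for the retail game, but treat both the names and the sample values as assumptions to verify against your own file rather than as official documentation:

```ini
; Fade-distance multipliers behind the in-game Object/Item/Actor Fade
; sliders. Lower values make things pop in closer, raising frame rate
; at the cost of early warning of enemies and loot.
; The values shown are illustrative, not the shipping defaults.
[LOD]
fLODFadeOutMultObjects=1.0
fLODFadeOutMultItems=1.0
fLODFadeOutMultActors=1.0
```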
The Test
Given that we're looking at GPU performance, we tested all graphics cards with the same very fast CPU to minimize any CPU bottlenecks. In future articles we will look at how the CPU impacts performance under Oblivion, but for now the GPU is our focus. All ATI cards were run on a CrossFire 3200 platform, while all NVIDIA cards were run on an nForce4 SLI x16 platform.
The latest drivers from ATI and NVIDIA were used, including ATI's Chuck patch for Oblivion, which enables CrossFire support and AA+HDR rendering. Given the low frame rates we're already dealing with, enabling AA simply didn't make sense, as you will see from the performance results on the coming pages. We would much rather increase detail settings than turn on AA in Oblivion.
CPU: AMD Athlon 64 FX-60 (2.6GHz/1MBx2)
Motherboard: ASUS A8N32-SLI (NVIDIA) / ASUS A8R32-MVP (ATI)
Chipset: NVIDIA nForce4 SLI x16 / ATI CrossFire 3200
Chipset Drivers: nForce4 6.85 / ATI Catalyst 6.4
Hard Disk: Seagate 7200.9 300GB SATA
Memory: 2 x 1GB OCZ PC3500 DDR (2-3-2-7)
Video Drivers: ATI Catalyst 6.4 w/ Chuck Patch / NVIDIA ForceWare 84.43
Desktop Resolution: 1280 x 1024 - 32-bit @ 60Hz
OS: Windows XP Professional SP2
Comments
smitty3268 - Friday, April 28, 2006 - link
Well, all the tests that had the XT ahead of the XTX were obviously CPU bound, so for all intents and purposes you should have read the performance as being equal. I would like to know a bit about the drivers, though. Were you using Catalyst AI, and does it make a difference?
coldpower27 - Thursday, April 27, 2006 - link
Quite a nice post there, well said Jarred.
JarredWalton - Thursday, April 27, 2006 - link
LOL - a Bolivian = Oblivion. Thanks, Dragon! :D (There are probably other typos as well. Sorry.)
alpha88 - Thursday, April 27, 2006 - link
Opteron 165, 7800GTX 256MB.
I run at 1920x1200 with every in-game setting set to max, HDR, no AA (16x AF).
The game runs just fine.
I don't know what the framerates are, but whatever they are, it's very playable.
I have a few graphics mods installed (new textures), and the graphics are good enough that I randomly stop and take screenshots because the view looks so awesome.
z3R0C00L - Thursday, April 27, 2006 - link
The game is a glimpse at the future of gaming. The 7x00 series is old. True, nVIDIA was able to remain competitive with revamped 7800's, which they now call 7900's, but consumers need to remember that these cards have a smaller die for a reason... they offer fewer features, less performance, and are not geared towards HDR gaming.
Right now nVIDIA and ATi have a complete role reversal from the x800XT PE vs. 6800 Ultra days. The 6800 Ultra performed on par with or beat the x800XT PE. The kick was that the 6800 Ultra produced more heat (larger die) and was louder (larger cooler), but it had more features and was more forward looking. Right now we have the same thing.
ATi's x1900 series has a larger die, produces more heat (a larger die means more voltage to operate) and comes with a larger cooler. The upside is that it's a better card. The x1900 series totally dominates the 7900 series. Some will argue about OpenGL, others will point to nonexistent flaws in ATi's drivers... the truth is, those who make these comments on both sides are hardware fans. Product wise, the x1900 series should be the card you buy if you're looking for a high end card... if you're looking more towards the middle of the market, the x1800XT is better than the 7900GT.
Remember performance, features and technology... the x1k series has all of them above the 7x00 series. Larger die space... more heat. Larger die space... more features.
Heat and power for features and performance... hmm, fair tradeoff if you ask me.
aguilpa1 - Thursday, April 27, 2006 - link
Inefficient game programming is no excuse to go out and spend $1200 on a graphics system. Games built on the old Crytek CryEngine have proven they can provide 100% of the Oblivion immersion and eye candy without crippling your graphics system and bringing your computer to a crawl. Ridiculous game and test... 'nuff said.
dguy6789 - Thursday, April 27, 2006 - link
The article is of a nice quality, very informative. However, what I ponder more than GPU performance in this game is CPU performance. Please do an in-depth CPU performance article that includes Celerons, Pentium 4s, Pentium Ds, Semprons, Athlon 64s, and Athlon 64 X2s. FiringSquad did an article, however it only contained four AMD CPUs that were of relatively the same speed in the first place. I, as well as many others, would greatly appreciate an in-depth article covering CPU performance, dual core benefits, and anything else you can think of.
coldpower27 - Thursday, April 27, 2006 - link
I would really enjoy a CPU scaling article with Intel based processors, from the Celeron D's to the Pentium 4's and Pentium D's, in this game.
frostyrox - Thursday, April 27, 2006 - link
It's something I already knew, but I'm glad AnandTech has brought it into full view. Oblivion is arguably one of the best PC games I've seen in 2006, and could very well turn out to be one of the best we'll see all year. Instead of optimizing the game for the PC, Bethesda (and Microsoft indirectly) bring to the PC a half-*ssed, amateur, embarrassing, and insanely bug-ridden 360 port. I think I have the right to say this because I have a relatively fast PC (A64 3700+, X800 XL, 2GB Corsair, SATA2 HDDs, etc.) and I'm roughly 65 hours into Oblivion right now. Next time Bethesda should use the Daikatana game engine - that way gamers with decent PCs might not see frame rates of 75 drop to 25 every time an extra character comes onto the screen and sneezes. Right now you may be thinking that I'm mad about all this. Not quite. But I will say this much: next time I get the idea of upgrading my PC, I'll have to remember that upgrading the video card may be pointless if the best games we see this year are 360 ports running at 30 frames. So here's to you, Bethesda and Microsoft, for ruining a gaming experience that could've been so much more if you gave a d*mn about PC gamers.
trexpesto - Thursday, April 27, 2006 - link
Maybe Oblivion should be $100?