F.E.A.R. GPU Performance Tests: Setting a New Standard
by Josh Venning on October 20, 2005 9:00 AM EST
No Soft Shadows and No AA/AF Performance Tests
These are the settings that we would recommend for most people. Anisotropic filtering can be enabled for a minimal performance drop, but many people just won't have hardware that can handle soft shadows or antialiasing. Also keep in mind that we have made quite a big deal of the fact that soft shadows in this game just aren't worth it.
Without soft shadows or AA enabled, all of these cards manage playable framerates up to 1024x768. The X1300 Pro is the most limited of the group and can really only play this game well at 640x480 and 800x600, even without AA or soft shadows enabled. But framerates at 1280x1024 and 1600x1200 are low across the board, giving us a first look at the graphical demands of this game. The 7800 GTX and 7800 GT still manage very respectable framerates at 1600x1200, and at that resolution, the 6800 GT and X1800 XL are low but still playable. We consider 25 fps about the minimum for the game to be barely "playable", but to really enjoy the gameplay, you'll want a framerate of about 35 fps. Now, let's look at what happens when we turn on 4xAA and 8xAF.
117 Comments
Regs - Friday, October 21, 2005
This is one of the reasons why I don't think the 7800 GTX or X1000s are worth buying. I feel sorry for the people who paid over 600 dollars for them when they can't even play FEAR @ 1280x1024 with AA.
fogeyman - Friday, October 21, 2005
FEAR is quite clearly optimized poorly. However, claiming that people pay over $600 for a GTX without being able to play at 1280x1024 with AA is totally wrong. As the review shows, it is easily playable for less than $600 (price-wise, at least for the GTX). Not to mention, you can even kick the resolution up to 1600x1200 and get only slightly unusable FPS. Specifically, at 1280x1024 with all settings on max except for soft shadows, the GTX gets a playable 39 fps. ATI is off the mark, but NVIDIA is okay. And as for the cost of the 7800GTX, it is (as of now, off Newegg) available in $20 intervals from $460-$500, with the $500 version including BF2, plus one $580 version. Clearly, you can get the GTX for over $100 less than your "$600" price. And no, exaggerating by over $100 is not negligible, not at all.
Note: what I mean by "slightly unusable" is not that it is slightly problematic, but rather that it is in fact unplayable, missing the playability mark by only a little.
LoneWolf15 - Friday, October 21, 2005
I would argue that if anything, poor optimization of F.E.A.R. is the more likely culprit. I've seen screenshots, and so far, I'm not impressed enough to put down the money. Graphics don't seem to be anywhere near as good as the hype has stated (previous Half-Life 2 shots look far better IMO; perhaps I have to play it to see). Add that to the fact that there's already a 1.01 patch on the day of the game's release, and I think that's a symptom of a game that needs more under-the-hood work. I'll wait to see the results of testing for more games; one is not enough.
P.S. To all who said this review should have had more ATI cards, you were right on the money. This review has the GeForce 6600GT and 6800GT, and doesn't even include the ATI counterparts to them (read: X800GT, X800XL)? That's poor.
Jackyl - Friday, October 21, 2005
I really do think developers have either reached the limit in optimizing their code, or they are too lazy to do so. Or perhaps it's a conspiracy between ATI/Nvidia and developers? The fact is, you shouldn't NEED a $600 video card to run some of these games coming out today. The sheer lack of performance shown here on a high-dollar card shows us that something is wrong in the industry. Anyone notice a trend here in the industry? Supposedly the GPU power of the cards is increasing; the X800 claims to be two times faster than an "old" 9800 Pro. Yet the game engines being written today can't crank out more than 40fps at a measly resolution of 1280x1024? Something is wrong in the industry. As someone else said in another post... something has got to give.
Le Québécois - Friday, October 21, 2005
The problem is simple... PC game developers have no limits to speak of. They know there is always something new coming out that will run their game perfectly. That's not the case in the console market. Since console developers are going to be "stuck" with the same HW for 4-5 years, they HAVE to optimize their code. That's why you see games on the same system (GameCube, for example) with graphics twice as beautiful as older games running on the SAME HW. Take RE4, for example... nobody even thought that level of graphics could be achieved on a GC... but it was.
g33k - Friday, October 21, 2005
I can't really complain, as the 6800GT was included in the article. Good read, I enjoyed it.
PrinceGaz - Friday, October 21, 2005
I'd say this was a fairly good performance review except for the choice of graphics-cards. The selection of current nVidia cards is excellent, including both 7800 models and the popular GF6 cards (6800GT and 6600GT), from which the performance of other 6800/6600 variants can be extrapolated. Given the use of a PCIe platform, the only cards I would add are a standard 6200 (not TC) and a PCX5900; the PCX5900 would give FX5900 owners a good idea of how their card would perform and be a guide to general GF5 series performance. A 7800GTX SLI setup is also needed to show what benefit it offers, but I wouldn't bother testing anything slower in SLI, as it is not a viable upgrade.
The selection of ATI X1000 series cards was also excellent, but including only an X800GT from the previous generation was woefully inadequate. Ideally, an X850XT, X800XL, and X700Pro would also be added to give more complete information. For the generation before that, just as a PCX5900 could be used for nVidia, an X600Pro/XT could be used for ATI, as that would be equivalent to a 9600Pro/XT. It's a pity there isn't a PCIe version of the 9800Pro, but a 9600Pro/XT would be the next best thing. Until you can set up a Crossfire X1800XT, there is no point including any Crossfire tests.
So my recommended gfx-card selection is: nVidia 7800GTX SLI, 7800GTX, 7800GT, 6800GT, 6600GT, 6200, PCX5900; ATI X1800XT, X1800XL, X1600XT, X1300Pro, X850XT, X800XL, X800GT, X700Pro, X600Pro/XT. That may seem a daunting list, but it is only 16 cards instead of 10, so it is not overwhelming. All the cards are PCIe, so you only need the one box, and it includes a good selection of old and new cards.
The only other thing I'd change is the test system. The FX-55 processor is fine, though an FX-57 would be even better; people who suggest using something slower when testing slower video-cards are missing the point of a video-card review. I would up the memory to 2GB (2x 1GB), though, just to keep possible stuttering from affecting the results, even if that means slowing the timings slightly to 2-3-2.
Le Québécois - Friday, October 21, 2005
Oh, and your selection of video cards seems pretty good to me :P, since people with a 9800Pro will perform close to the X700Pro.
Le Québécois - Friday, October 21, 2005
The fastest CPU is good if you want to know exactly how well a GPU does in a game... but that still doesn't reflect the majority of people who will run the game... that's why a slower CPU could be nice. If the idea behind this review was to show people how well their HW will do in this game, only using the best of the best is not the best way to achieve that goal.
PrinceGaz - Friday, October 21, 2005
The aim of video-card reviews is to show as best as possible what the video-card is capable of when all other variables (such as CPU limitations) are removed from the equation. That's why even testing an AGP GeForce 2GTS with a high-end FX-57 processor would be preferable, as the performance is determined entirely by the graphics-card. If you use slower CPUs with slower graphics-cards, it is difficult to say for sure whether it is the CPU or the graphics-card that is the limiting factor. All a review which tries to mix and match CPUs and graphics-cards is saying is, "this combination went this fast, but we have no idea if it was the CPU or the graphics-card that was the limiting factor, so we don't know if you should buy a faster CPU or a faster graphics-card".