F.E.A.R. GPU Performance Tests: Setting a New Standard
by Josh Venning on October 20, 2005 9:00 AM EST
Posted in: GPUs
The Failure of Soft Shadows and Parallax Mapping
Soft Shadows
We've seen how soft shadows can be used effectively in games like The Chronicles of Riddick, and Monolith has decided to add them as an option in FEAR. Used correctly, soft shadows greatly enhance lighting in a game by giving shadows cast by objects different levels of darkness, making them more realistic. This effect can consume a lot of processing power, however, and FEAR is no exception.
Basically, soft shadows capture the way shadows tend to fade at the edges, or cast overlapping outlines on walls and objects, depending on factors such as light angle and distance. If you've ever made a shadow puppet, you've seen this clearly: multiple outlines of your hand's shadow overlap on the wall with varying degrees of darkness (depending on the light source). And if you move your hand closer to or farther from the light, you can see how the soft shadows change dynamically.
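For the curious, the size of that fuzzy edge (the penumbra) falls out of simple similar-triangle geometry. The snippet below is our own back-of-the-envelope illustration of the shadow-puppet example, not anything from FEAR's engine, and the light diameter and distances are made-up numbers.

```c
#include <stdio.h>

/* Penumbra width from similar triangles: an area light of diameter d_light,
 * an occluder (the hand) at dist_occluder from the light, and a receiving
 * surface (the wall) at dist_receiver from the light. */
double penumbra_width(double d_light, double dist_occluder, double dist_receiver)
{
    return d_light * (dist_receiver - dist_occluder) / dist_occluder;
}

int main(void)
{
    /* Made-up numbers: a 5 cm lamp, hand 50 cm away, wall 2 m away... */
    printf("penumbra: %.0f cm\n", 100.0 * penumbra_width(0.05, 0.50, 2.0)); /* 15 cm */
    /* ...move the hand to 25 cm from the lamp and the soft edge widens: */
    printf("penumbra: %.0f cm\n", 100.0 * penumbra_width(0.05, 0.25, 2.0)); /* 35 cm */
    return 0;
}
```

Halving the hand-to-light distance more than doubles the penumbra here, which is exactly the dynamic behavior a good soft-shadow implementation should approximate.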
The idea is to capture this effect in a game environment, but as any programmer knows, translating it to a game engine can be a very complex undertaking. Not only that, but as we mentioned before, calculating multiple shadows in real time can quickly become a major drain on processing power. With FEAR, we've seen how big a performance hit we took when we enabled soft shadows, but you may wonder: does the effect at least look good?
The short answer is no. The way FEAR implements soft shadows ends up looking unrealistic: more stratified and strange than soft. It simply looks as though the game draws multiple copies of each shadow at the edges of objects, offset slightly up, down, left, and right at different degrees of darkness, regardless of the light source. This wouldn't be so bad if the multiple shadows were not so readily noticeable as such. It also would have been nice if the "blur factor" were more dynamic; in other words, if the shadow copies moved closer together or farther apart depending on where the object (say, an enemy soldier) sits in relation to the light sources and shadowed surfaces.
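To make the complaint concrete, here is a minimal sketch in C of the fixed-offset approach the game appears to be using. This is our own illustration of the visual behavior, not Monolith's shader code; the tap offsets and the toy shadow map are invented for the example.

```c
#include <math.h>
#include <stdio.h>

typedef struct { float x, y; } vec2;

/* Toy stand-in for a shadow-map lookup: a hard-edged circular shadow of
 * radius 0.1 centered at (0.5, 0.5). Returns 1 if lit, 0 if shadowed. */
static float sample_shadow_map(vec2 uv)
{
    float dx = uv.x - 0.5f, dy = uv.y - 0.5f;
    return (sqrtf(dx * dx + dy * dy) > 0.1f) ? 1.0f : 0.0f;
}

/* What FEAR's soft shadows appear to do: average several shadow samples at
 * small FIXED offsets (up, down, left, right). Because the offsets never
 * scale with light size or occluder distance, the separate copies read as
 * visible banding rather than a smooth penumbra. */
static float fear_style_soft_shadow(vec2 uv)
{
    const vec2 taps[5] = {
        {  0.000f,  0.000f }, {  0.002f,  0.000f }, { -0.002f,  0.000f },
        {  0.000f,  0.002f }, {  0.000f, -0.002f }
    };
    float lit = 0.0f;
    for (int i = 0; i < 5; i++) {
        vec2 p = { uv.x + taps[i].x, uv.y + taps[i].y };
        lit += sample_shadow_map(p);
    }
    return lit / 5.0f;   /* 0 = fully dark ... 1 = fully lit */
}

int main(void)
{
    /* Walk across the shadow edge: only a few quantized light levels appear. */
    for (float x = 0.595f; x <= 0.606f; x += 0.002f) {
        vec2 uv = { x, 0.5f };
        printf("x=%.3f light=%.2f\n", x, fear_style_soft_shadow(uv));
    }
    return 0;
}
```

Walking across the shadow edge produces only a handful of quantized light levels, which is the banding we see in the game; scaling the tap radius with light size and blocker distance (as percentage-closer soft shadow techniques do) would give the dynamic blur we asked for above.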
This is difficult to appreciate until you see it happening in the game, but you can get a better idea of it by looking at a few pictures. We took some screenshots of a scene with and without soft shadows enabled, on both NVIDIA and ATI cards. Please ignore the slight lighting and position differences between these screens.
Seeing the pictures gives you a slightly better idea of how the soft shadow option looks in FEAR. Since it isn't impressive and it incurs a major performance hit, we don't see any reason to enable it. It might look better with AA enabled, but unfortunately, as of right now, soft shadows and AA can't be enabled at the same time. A later patch might allow this, but as our performance tests have shown, the cost would be almost too great to contemplate.
Parallax Mapping
While the detailed textures, excellent lighting, well-done static and dynamic shadows (in spite of the soft shadow issue), large intricate particle systems, and various beautiful effects of FEAR come together to form an immersive and fluid graphical experience, there are a few caveats. To its credit, Monolith was very aggressive with the features it included and is on the leading edge of technology. Using a deep parallax mapping algorithm to represent damage is a very cool idea, but the implementation in FEAR lacks key features such as self-occlusion and self-shadowing. When you pass a wall with a chunk blown out of it, the hole swims around, flattens out, and eventually looks like unidentifiable goo stuck to the wall as the viewing angle gets very steep.
The parallax mapping used looks great from angles where the entire interior of a hole can be seen. The problem occurs at viewing angles where a near edge should block the view of part (or all) of the indentation's interior. Rather than occluding anything, parts of the texture that should become invisible are still shown (albeit distorted). This completely destroys the illusion of depth at steep angles, making the texture swim until it loses its three-dimensionality entirely. Algorithms exist that correctly handle self-occlusion in parallax mapping. While we can appreciate cheaper parallax mapping algorithms as a kind of upgraded bump mapping, dramatic surface deformation should either be done more correctly or not at all in cases where the viewer can move to angles that break the effect.
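To illustrate the difference, here is a hedged sketch in C of the two approaches. The first function is the single-step parallax offset, roughly the class of technique FEAR appears to use; the second marches the view ray through the heightfield the way parallax occlusion mapping does. This is our own illustration with an invented "bullet hole" depth map, not the game's actual shaders.

```c
#include <math.h>
#include <stdio.h>

typedef struct { float x, y; } vec2;
typedef struct { float x, y, z; } vec3;

/* Toy depth map standing in for a texture fetch: 0 = flush with the wall,
 * 1 = deepest. A round "bullet hole" of radius 0.2 centered at (0.5, 0.5). */
static float sample_depth(vec2 uv)
{
    float dx = uv.x - 0.5f, dy = uv.y - 0.5f;
    float r2 = dx * dx + dy * dy;
    return (r2 < 0.04f) ? 1.0f - r2 / 0.04f : 0.0f;
}

/* Single-step parallax offset, roughly the class of technique FEAR appears
 * to use. view_ts is the normalized tangent-space view vector (z points out
 * of the wall). The shift grows without bound as view_ts.z -> 0, and nothing
 * checks whether a nearer edge should hide the shifted sample -- the missing
 * self-occlusion that makes holes "swim" at steep angles. */
static vec2 parallax_offset(vec2 uv, vec3 view_ts, float depth_scale)
{
    float d = sample_depth(uv) * depth_scale;
    vec2 shifted = { uv.x - view_ts.x / view_ts.z * d,
                     uv.y - view_ts.y / view_ts.z * d };
    return shifted;
}

/* Occlusion-aware alternative (parallax occlusion mapping, sketched):
 * march the eye ray into the heightfield in small steps and stop at the
 * first sample below the surface, so near edges correctly block the view
 * of the interior behind them. Costlier, but it survives steep angles. */
static vec2 pom_offset(vec2 uv, vec3 view_ts, float depth_scale, int steps)
{
    vec2 step = { -view_ts.x / view_ts.z * depth_scale / steps,
                  -view_ts.y / view_ts.z * depth_scale / steps };
    float ray_depth = 0.0f, d_step = 1.0f / steps;
    vec2 p = uv;
    for (int i = 0; i < steps && ray_depth < sample_depth(p); i++) {
        p.x += step.x;
        p.y += step.y;
        ray_depth += d_step;
    }
    return p;
}

int main(void)
{
    /* Same surface point inside the hole, increasingly grazing view angle. */
    vec2 uv = { 0.45f, 0.5f };
    for (float z = 0.9f; z > 0.05f; z -= 0.2f) {
        float xy = sqrtf(1.0f - z * z);  /* keep the view vector unit-length */
        vec3 v = { xy, 0.0f, z };
        vec2 a = parallax_offset(uv, v, 0.05f);
        vec2 b = pom_offset(uv, v, 0.05f, 32);
        printf("view z=%.1f  parallax u=%+.3f  pom u=%+.3f\n", z, a.x, b.x);
    }
    return 0;
}
```

At a mild angle the two agree, but as the view vector flattens out, the plain parallax offset shifts the sample right off the hole entirely (the swimming we describe above), while the ray march stops at the near crater wall, which is exactly the self-occlusion FEAR is missing.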
But again, we would love to give credit where credit is due. We would rather see game developers experiment with new technology and put something out there than let the true power of our graphics cards remain dormant. Monolith was ahead of the curve with the graphics in Tron 2.0, and they haven't let us down with the quality of FEAR.
117 Comments
Regs - Friday, October 21, 2005
This is one of the reasons why I don't think the 7800 GTX or X1000s are worth buying. I feel sorry for the people who paid over 600 dollars for them when they can't even play FEAR @ 1280x1024 with AA.
fogeyman - Friday, October 21, 2005
FEAR is quite clearly optimized poorly. However, claiming that people pay over $600 for a GTX without being able to play at 1280x1024 with AA is totally wrong. It is easily playable, as the review shows, for less than $600 (price-wise, at least for the GTX). Not to mention, you can even kick the resolution up to 1600x1200 and get only slightly unusable FPS.
Specifically, at 1280x1024 with all settings maxed except soft shadows, the GTX gets a playable 39 fps. ATI is off the mark, but NVIDIA is okay. As for the cost of the 7800 GTX, it is available (as of now, off Newegg) in $20 intervals from $460-$500, with the $500 version including BF2, plus one $580 version. Clearly, you can get the GTX for over $100 less than your "$600" price. And no, exaggerating by over $100 is not negligible, not at all.
Note: what I mean by "slightly unusable" is not that it is slightly problematic, but rather that it is in fact unplayable, though it misses the playability mark by only a little.
LoneWolf15 - Friday, October 21, 2005
I would argue that if anything, poor optimization of F.E.A.R. itself is the more likely culprit. I've seen screenshots, and so far, I'm not impressed enough to put down the money. The graphics don't seem to be anywhere near as good as the hype has claimed (previous Half-Life 2 shots look far better, IMO; perhaps I have to play it to see). Add to that the fact that there's already a 1.01 patch on the day of the game's release, and I think that's a symptom of a game that needs more under-the-hood work. I'll wait to see the results of testing with more games; one is not enough.
P.S. To all who said this review should have had more ATI cards, you were right on the money. This review has the GeForce 6600GT and 6800GT, and doesn't even include the ATI counterparts to them (read: X800GT, X800XL)? That's poor.
Jackyl - Friday, October 21, 2005
I really do think developers have either reached the limit in optimizing their code, or they are too lazy to do so. Or perhaps it's a conspiracy between ATI/NVIDIA and the developers? The fact is, you shouldn't NEED a $600 video card to run some of the games coming out today. The sheer lack of performance shown here on a high-dollar card shows us that something is wrong in the industry.
Anyone notice a trend in the industry? Supposedly the GPU power of the cards is increasing; the X800 claims to be two times faster than an "old" 9800 Pro. Yet the game engines being written today can't crank out more than 40 fps at a measly resolution of 1280x1024? Something is wrong in the industry. As someone else said in another post, something has got to give.
Le Québécois - Friday, October 21, 2005
The problem is simple: PC game developers have no limits to speak of. They know there is always something new coming up that will run their game perfectly. That's not the case in the console market. Since console developers are going to be "stuck" with the same hardware for 4-5 years, they HAVE to optimize their code. That's why you see games on the same system (GameCube, for example) with graphics twice as beautiful as older games running on the SAME hardware. Take RE4, for example: nobody thought that level of graphics could be achieved on a GC, but it was.
g33k - Friday, October 21, 2005
I can't really complain, as the 6800GT was included in the article. Good read; I enjoyed it.
PrinceGaz - Friday, October 21, 2005
I'd say this was a fairly good performance review except for the choice of graphics cards.
The selection of current NVIDIA cards is excellent, including both 7800 models and the popular GF6 cards (6800GT and 6600GT), from which the performance of other 6800/6600 variants can be extrapolated. Given the use of a PCIe platform, the only cards I would add are a standard 6200 (not TC) and a PCX5900; the PCX5900 would give FX5900 owners a good idea of how their card would perform and serve as a guide to general GF5-series performance. A 7800GTX SLI setup is also needed to show what benefit it offers, but I wouldn't bother testing anything slower in SLI, as it is not a viable upgrade.
The ATI X1000-series cards included were also an excellent choice, but using only an X800GT from the previous generation was woefully inadequate. Ideally, an X850XT, X800XL, and X700Pro would also be added to give more complete information. For the generation before that, just as a PCX5900 could stand in for NVIDIA, an X600Pro/XT could stand in for ATI, as it is equivalent to a 9600Pro/XT. It's a pity there isn't a PCIe version of the 9800Pro, but a 9600Pro/XT would be the next best thing. Until you can set up a Crossfire X1800XT, there is no point in including any Crossfire tests.
So my recommended graphics-card selection is: NVIDIA 7800GTX SLI, 7800GTX, 7800GT, 6800GT, 6600GT, 6200, and PCX5900; ATI X1800XT, X1800XL, X1600XT, X1300Pro, X850XT, X800XL, X800GT, X700Pro, and X600Pro/XT. That may seem a daunting list, but it is a total of only 16 cards instead of 10, so it is not overwhelming. All the cards are PCIe, so you only need the one test box, and it includes a good selection of old and new cards.
The only other thing I'd change is the test system. The FX-55 processor is fine, though an FX-57 would be even better; people who suggest using something slower when testing slower video cards are missing the point of a video-card review. I would up the memory to 2GB (2x 1GB), though, just to keep possible stuttering from affecting the results, even if that means slowing the timings slightly to 2-3-2.
Le Québécois - Friday, October 21, 2005
Oh, and your selection of video cards seems pretty good to me :P Since people with a 9800 Pro will see performance close to the X700 Pro's.
Le Québécois - Friday, October 21, 2005
The fastest CPU is good if you want to know exactly how well a GPU does in a game, but that still doesn't reflect the majority of people who will run the game; that's why a slower CPU could be nice. If the idea behind this review was to show people how well their hardware will do in this game, only using the best of the best is not the best way to achieve that goal.
PrinceGaz - Friday, October 21, 2005
The aim of video-card reviews is to show as clearly as possible what the video card is capable of when all other variables (such as CPU limitations) are removed from the equation. That's why even testing an AGP GeForce 2 GTS with a high-end FX-57 processor would be preferable, as the performance would then be determined entirely by the graphics card.
If you use slower CPUs with slower graphics cards, it is difficult to say for sure whether the CPU or the graphics card is the limiting factor. All a review that tries to mix and match CPUs and graphics cards can say is, "this combination went this fast, but we have no idea whether the CPU or the graphics card was the limiting factor, so we don't know if you should buy a faster CPU or a faster graphics card."