ATI Radeon HD 2900 XT: Calling a Spade a Spade
by Derek Wilson on May 14, 2007, 12:04 PM EST
Posted in: GPUs
General Image Quality
Beyond antialiasing, there are quite a number of factors that go into making real-time 3D look good. Real-time graphics are an optimization problem, and the balance between performance and quality is very important. There is no single "right" way to do graphics, and AMD and NVIDIA must listen carefully to developers and consumers to deliver what they believe is the sweet spot between doing things fast and doing things accurately.
NVIDIA currently offers much more customizable image quality: users are able to turn individual optimizations on and off as they see fit. AMD only offers a couple of specific settings that affect image quality, while most of their optimizations are handled on a per-game basis by the ominously named Catalyst A.I. The options available are Disabled, Standard, and Advanced, which doesn't really tell us what is going on behind the scenes. We leave this setting on Standard for all of our tests, as it is the default and most users will leave it alone.
Aside from optimizations, texture filtering plays a large role in image quality when high levels of filtering are called for. Point sampling and bilinear filtering are trivial, and no one skimps on these duties, but once we get to trilinear and anisotropic filtering, the number of texture samples needed and the number of calculations performed per pixel go up very quickly. To mitigate the cost of these operations, both AMD and NVIDIA attempt to apply high levels of filtering where they are needed and lower levels where it won't matter much. Of course, there is much debate over where to draw those lines, and NVIDIA and AMD have chosen different paths.
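To put the cost in concrete terms, here is a minimal sketch of bilinear filtering in plain C (our own illustration, not vendor code; the Texture struct and clamp-to-edge addressing are simplifications chosen for brevity). Even this cheapest "smooth" filter reads four texels per sample; trilinear doubles that by blending two mipmap levels, and 16x anisotropic filtering can multiply it again by taking up to 16 such samples along the axis of anisotropy.

#include <math.h>
#include <stdint.h>

/* Hypothetical RGBA8 texture used for illustration only. */
typedef struct { int w, h; const uint8_t *rgba; } Texture;

/* Fetch one channel of one texel, clamping coordinates to the edge. */
static float texel(const Texture *t, int x, int y, int c)
{
    if (x < 0) x = 0;
    if (x >= t->w) x = t->w - 1;
    if (y < 0) y = 0;
    if (y >= t->h) y = t->h - 1;
    return t->rgba[(y * t->w + x) * 4 + c] / 255.0f;
}

/* Bilinear filtering: 4 texel reads and 3 linear interpolations per
 * channel. Hardware does this for every pixel it shades, which is why
 * trilinear (x2) and anisotropic (up to x16) get expensive so quickly. */
float sample_bilinear(const Texture *t, float u, float v, int c)
{
    float x = u * t->w - 0.5f, y = v * t->h - 0.5f;
    int   x0 = (int)floorf(x), y0 = (int)floorf(y);
    float fx = x - x0, fy = y - y0;
    float t00 = texel(t, x0,     y0,     c);
    float t10 = texel(t, x0 + 1, y0,     c);
    float t01 = texel(t, x0,     y0 + 1, c);
    float t11 = texel(t, x0 + 1, y0 + 1, c);
    return (t00 * (1 - fx) + t10 * fx) * (1 - fy)
         + (t01 * (1 - fx) + t11 * fx) * fy;
}

A 16x anisotropic trilinear sample can therefore touch on the order of 8 x 16 = 128 texels, which is exactly why both vendors try to spend those samples only where the screen-space footprint of the texture actually demands them.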
To investigate texture filtering quality, we have employed the trusty D3D AF-Tester. This long-lived application lets us look at a single texture with differently colored mipmap levels to see how hardware filters them under different settings (we sketch the colored-mipmap trick itself below, after the screenshots). Thankfully, we no longer have to talk about angle-dependent anisotropic filtering (which is actually a contradiction in terms anyway): AMD and NVIDIA both finally do good quality anisotropic filtering that results in higher resolution textures being used more often where possible. Take a look at these images to see how the different hardware stacks up.
[Image comparison: D3D AF-Tester tunnel at 8x/16x AF — NVIDIA G80 vs. R5xx vs. R6xx]
It still looks like NVIDIA is doing slightly more angle-independent filtering, but in practice it will be very difficult to tell the difference between an image rendered on AMD hardware and one rendered on NVIDIA hardware. We can also see that AMD has slightly tweaked their AF technique to eliminate some of the odd transitions we noticed on R5xx hardware. This comes through a little better if we look at a flat plane:
[Image comparison: D3D AF-Tester flat plane at 8x AF — R5xx vs. R6xx]
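As an aside, the colored-mipmap trick that D3D AF-Tester and similar tools rely on is simple enough to sketch. The snippet below is our own minimal OpenGL illustration (not the tester's actual code): every mip level is uploaded as a solid, distinct color, so the color visible on screen at each pixel reveals which level(s) the filtering hardware sampled and how smoothly it blends between them. Rendering a tunnel or a tilted plane with such a texture at various AF settings produces exactly the kind of patterns shown in the screenshots above.

#include <GL/gl.h>
#include <string.h>

/* Upload a full mip chain for the currently bound 2D texture in which
 * each level is a solid, distinct color. base_size should be a power of
 * two no larger than 256 for this simple scratch buffer. */
void upload_colored_mipmaps(int base_size)
{
    static const unsigned char colors[][3] = {
        {255, 0, 0}, {0, 255, 0}, {0, 0, 255},
        {255, 255, 0}, {0, 255, 255}, {255, 0, 255},
    };
    static unsigned char buf[256 * 256 * 3];

    if (base_size > 256)
        base_size = 256;

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  /* tightly packed RGB rows */

    for (int size = base_size, level = 0; size >= 1; size /= 2, level++) {
        const unsigned char *c = colors[level % 6];
        for (int i = 0; i < size * size; i++)
            memcpy(&buf[i * 3], c, 3);
        glTexImage2D(GL_TEXTURE_2D, level, GL_RGB, size, size, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, buf);
    }

    /* Trilinear minification so transitions between the colored levels
     * (and any shortcuts the driver takes with them) remain visible. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);
}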
We did happen to notice at least one image quality issue not related to texture filtering on AMD hardware. The problem turns up in Rainbow Six: Vegas in the form of very bad banding where we should see HDR lighting. We didn't notice this problem on G80, as we can see from our comparison.
[Screenshots: Rainbow Six: Vegas HDR lighting comparison — AMD R600 vs. NVIDIA G80]
We also noticed a small issue in Oblivion at one point where the Oblivion gate shader would bleed through other objects, but it was not reproducible and we couldn't get a screenshot of it, so it could be a game-related issue rather than a hardware or driver problem. We'll keep our eyes peeled.
Overall, image quality on the current DX10 hardware is quite good, but we will continue to dig further into the matter to make sure it stays that way. We're also waiting for DX10 games before we can determine whether other differences exist; hopefully there won't be any, as DX10 defines a single set of requirements for all hardware.
Comments
imaheadcase - Tuesday, May 15, 2007
Says who? Most people I know don't care to turn on AA since they visually can't see a difference. Only people who are picky about everything they see normally do; the majority of people don't notice "jaggies" since the brain fixes them for you when you play.
Roy2001 - Tuesday, May 15, 2007
Says who? Most people I know don't care to turn on AA since they visually can't see a difference.
------------------------------------------
Wow, I never turn it off once I am used to having AA. I cannot play games anymore without AA.
Amuro - Tuesday, May 15, 2007
Says who? No one who spent $400 on a video card would turn off AA.
SiliconDoc - Wednesday, July 8, 2009
Boy, we'd sure love to hear those red fans claiming they turn off AA nowadays and it doesn't matter. LOL
It's just amazing how thick it gets.
imaheadcase - Tuesday, May 15, 2007
Sure they do, because it's a small "tweak" with a performance hit. I say, who spends $400 on a video card to remove "jaggies" when they are not noticeable to most people in the first place? Same reason most people don't go for SLI or Crossfire: in the end it offers nothing substantial for most people who play games.
Some might like it, but they would not miss it if they stopped using it for some time. It's not like it's a make-or-break feature of a video card.
motiv8 - Tuesday, May 15, 2007
Depends on the game or player, tbh. I play within ladders without AA turned on, but for games like Oblivion I would use AA. Depends on your needs at the time.