Antialiasing Performance

With midrange cards, dropping the resolution a little and enabling antialiasing is usually an option. We tend to prefer higher resolutions and higher detail settings, especially in an age where games like Oblivion and Splinter Cell: Chaos Theory sometimes force a choice between HDR and antialiasing. Hopefully we'll see fewer of these tradeoffs in the future. For now, we've selected three of the games we tested to evaluate AA performance for our midrange group.

Battlefield 2

We see the ~140fps CPU limitation of the Core 2 Extreme X6800 having less of an impact on the X1900 XT, while the rest of the pack scales similarly whether or not AA is enabled. Our 6600 GT was unable to render 1920x1440 with 4xAA due to its 128MB of memory, but it isn't playable with AA above 1024x768 anyway. While the high end of our test shows the top three cards playable at 1920x1440 with 4xAA, our 7600 GT can't be pushed past 1600x1200. The X1600 XT is stuck somewhere between 1024x768 and 1280x1024, depending on how smooth the gamer wants BF2 to run.
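As a rough sanity check on that 128MB limitation, here's a back-of-the-envelope framebuffer calculation. This is only a sketch: it assumes RGBA8 color and a 32-bit depth/stencil buffer, and real drivers add padding, compression metadata, and possibly extra buffers.

```python
# Approximate framebuffer memory needed at 1920x1440 with 4x MSAA.
# Assumes RGBA8 color and a 32-bit depth/stencil buffer; actual
# driver allocations will differ somewhat.
WIDTH, HEIGHT = 1920, 1440
BYTES_PER_PIXEL = 4   # 4 bytes for color, 4 for depth/stencil
SAMPLES = 4           # 4x multisample antialiasing

pixels = WIDTH * HEIGHT
MB = 1024 * 1024

msaa_color = pixels * BYTES_PER_PIXEL * SAMPLES  # multisampled color buffer
msaa_depth = pixels * BYTES_PER_PIXEL * SAMPLES  # multisampled depth/stencil
back_buffer = pixels * BYTES_PER_PIXEL           # resolved back buffer
front_buffer = pixels * BYTES_PER_PIXEL          # front buffer for scanout

total = msaa_color + msaa_depth + back_buffer + front_buffer
print(f"Framebuffer alone: ~{total / MB:.0f} MB")  # ~105 MB
```

Roughly 105MB of a 128MB card would go to the framebuffer alone before a single texture is loaded, which is why the 6600 GT simply can't run this setting.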

As with our non-AA test, the X1900 XT leads at the ~$300 price point, while the X1900 GT leads the 7900 GT in value without sacrificing performance. At the same time, the bump up in cost from the 7600 GT to an X1900 GT looks well worth it if resolutions above 1600x1200 are desired for Battlefield 2.

Half-Life 2: Episode One

This time the 6600 GT runs out of gas at 1280x1024 with 4xAA enabled. Meanwhile, every card other than the (stock) X800 GTO and X1600 XT is playable at 1600x1200 with 4xAA, which is a fairly good alternative to 1920x1440 without AA in Half-Life 2: Episode One. Enabling even a little AA brings more life to the game, and since most of the midrange cards we tested can pull it off, and a good many people don't run higher than 1600x1200 anyway, this is a great option.

Quake 4

Due to the low contrast edges in most of Quake 4's art and design, antialiasing is usually overkill; we'd prefer to run at a higher resolution or with uncompressed normal maps (ultra quality) rather than with AA enabled. But as Id favors OpenGL, we decided it would be beneficial to look at antialiasing under Quake 4 as well. As in our other tests, the 6600 GT and its 128MB of RAM just can't handle 4xAA at 1920x1440. We might care about this if the game were at all playable above 800x600 with 4xAA on that card. The X1900 GT maintains its performance lead over the 7900 GT with AA enabled, but only the X1900 XT can hang on to playability at 1920x1440 with 4xAA. We do see good performance from the X1900 GT and 7900 GT at 1600x1200, though. X1600 XT users will need to stop at 1024x768 if they want to enable 4xAA with high quality settings under Quake 4.
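To put the ultra quality tradeoff in perspective, here's a quick, purely illustrative comparison of uncompressed versus DXT-compressed normal map sizes. The 2048x2048 dimension is a hypothetical example for the arithmetic, not an actual Quake 4 asset size.

```python
# Illustrative memory cost of one normal map, uncompressed vs. DXT5.
# The texture size here is hypothetical; real game assets vary.
SIZE = 2048
MIP_OVERHEAD = 4 / 3   # a full mip chain adds about one third extra
MB = 1024 * 1024

uncompressed = SIZE * SIZE * 4 * MIP_OVERHEAD  # RGBA8: 4 bytes per texel
dxt5 = SIZE * SIZE * 1 * MIP_OVERHEAD          # DXT5: 1 byte per texel (4:1)

print(f"Uncompressed: {uncompressed / MB:.1f} MB")  # ~21.3 MB
print(f"DXT5:         {dxt5 / MB:.1f} MB")          # ~5.3 MB
```

A handful of large uncompressed normal maps can eat tens of megabytes of video memory, which is exactly the budget that 4xAA at high resolutions also competes for.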

Comments

  • Sharky974 - Friday, August 11, 2006 - link

    I tried comparing numbers for SCCT, FEAR, and X3. The problem is that Anand didn't bench any of these with AA in this midrange test, while other sites all use 4xAA as the default. So no direct numbers comparison on those three games is possible with those two Xbit/FS articles, at least.

    Although the settings are different, both FS and Anand showed FEAR as a tossup.

    It does appear other sites are confirming Anand's results more than I thought, though.

    And the X1900GT for $230 is a kickass card.
  • JarredWalton - Friday, August 11, 2006 - link

    The real problem is that virtually every level of a game can offer higher/lower performance relative to the average, and you also get levels that use effects that work better on ATI or NV hardware. Some people like to make a point about providing "real world" gaming benchmarks, but the simple fact of the matter is that any benchmark is inherently different from actually sitting down and playing a game - unless you happen to be playing the exact segment benchmarked, or perhaps the extremely rare game where performance is nearly identical throughout the entire game. (I'm not even sure what an example of that would be - Pacman?)

    Stock clockspeed 7900GT cards are fairly uncommon these days, since the cards are so easy to overclock. Standard clocks are actually supposed to be 450/1360 IIRC, and most cards are at least slightly overclocked in one or both areas. Throw in all the variables, plus things like whether or not antialiasing is enabled, and it becomes difficult to compare articles between any two sources. I tend to think of it as providing various snapshots of performance, as no one site can provide everything. So if we determine the X1900 GT is a bit faster overall than the 7900 GT and another site determines the reverse, the truth is that the cards are very similar, with some games doing better on one architecture and other games on the other arch.

    My last thought is that it's important to look at where each GPU manages to excel. If, for example (and I'm just pulling numbers out of a hat rather than referring to any particular benchmarks), the 7900 GT is 20% faster in Half-Life 2 but the X1900 GT still manages frame rates of over 100 FPS, while the X1900 GT is 20% faster in Oblivion where frame rates are closer to 40 FPS, I would definitely weight the Oblivion figures as being more important. Especially if you run on LCDs, super high frame rates become virtually meaningless. If you can average well over 60 frames per second, I would strongly recommend enabling VSYNC on any LCD. Of course, down the road we are guaranteed to encounter games that require more GPU power, but predicting which game engine is most representative of the future requires a far better crystal ball than what we have available.

    For what it's worth, I would still personally purchase an overclocked 7900 GT over an X1900 GT for a few reasons, provided the price difference isn't more than ~$20. First, SLI is a real possibility, whereas CrossFire with an X1900 GT is not (as far as I know). Second, I simply prefer NVIDIA's drivers -- the old-style, not the new "Vista compatible" design. Third, I find that NVIDIA always seems to do a bit better on brand new games, while ATI seems to need a patch or a new driver release to address performance issues -- not always, but at least that's my general impression; I'm sure there are exceptions to this statement. ATI cards are still good, and at the current price points it's definitely hard to pick a clear winner. Plus you have stuff like the reduced prices on X1800 cards, and in another month or so we will likely have new hardware at all of the price points. It's a never-ending rat race, and as always, people should upgrade only when they find that the current level of performance they have is unacceptable from their perspective.
  • arturnowp - Friday, August 11, 2006 - link

    I think another advantage of the 7900GT over the X1900GT is power consumption. I haven't checked the numbers on this, so I'm not 100% sure.
  • coldpower27 - Saturday, August 12, 2006 - link

    Yes, this is completely true, going by Xbit Labs' numbers.

    Stock 7900 GT: 48W
    eVGA SC 7900 GT: 54W
    Stock X1900 GT: 75W
  • JarredWalton - Friday, August 11, 2006 - link

    Speech-recognition + lack of proofing = lots of typos

    "... out of a hat..."
    "I would definitely weight..."
    "... level of performance they have is..."

    Okay, so there were only three typos that I saw, but I was feeling anal retentive.
  • Sharky974 - Friday, August 11, 2006 - link

    Not to beat this to death, but here are the X1900GT vs 7900GT benchmarks at FS:

    X1900GT:

    Wins - BF2, Call of Duty 2 (barely)

    Loses - Quake 4, Lock On: Modern Air Combat, FEAR (barely)

    Toss-ups - Oblivion (FS runs two benches, foliage/mountains, and the cards split them); Far Cry w/HDR (X1900 takes the two lower res benches, 7900 GT takes the two higher res benches)

    From Xbit's X1900 GT vs 7900 GT conclusion:

    "The Radeon X1900 GT generally provides a high enough performance in today’s games. However, it is only in 4 tests out of 19 that it enjoyed a confident victory over its market opponent and in 4 tests more equals the performance of the GeForce 7900 GT. These 8 tests are Battlefield 2, Far Cry (except in the HDR mode), Half-Life 2, TES IV: Oblivion, Splinter Cell: Chaos Theory, X3: Reunion and both 3DMarks. As you see, Half-Life 2 is the only game in the list that doesn’t use mathematics-heavy shaders. In other cases the new solution from ATI was hamstringed by its having too few texture-mapping units as we’ve repeatedly said throughout this review."

    Xbit review: http://www.xbitlabs.com/articles/video/display/pow...
  • Geraldo8022 - Thursday, August 10, 2006 - link

    I wish you would do a similar article concerning video cards for HDTV and HDCP. It is very confusing. Even though certain cards might state they are HDCP capable, it is not enabled.
  • tjpark1111 - Thursday, August 10, 2006 - link

    The X1800XT is only $200 shipped, why not include that card? If the X1900GT outperforms it, then ignore my comment (I've been out of the game for a while).
  • LumbergTech - Thursday, August 10, 2006 - link

    So you want to test the cheaper GPUs for those who don't want to spend quite as much... OK, well why are you using the CPU you chose then? That isn't exactly in the affordable segment for the average PC user at this point.
  • PrinceGaz - Thursday, August 10, 2006 - link

    Did you even bother reading the article, or did you just skim through it and look at the graphs and conclusion? May I suggest you read page 3 of the review, or in case that is too much trouble, read the relevant excerpt:

    quote:

    With the recent launch of Intel's Core 2 Duo, affordable CPU power isn't much of an object. While the midrange GPUs we will be testing will more than likely be paired with a midrange CPU, we will be testing with high end hardware. Yes, this is a point of much contention, as has always been the case. The arguments on both sides of the aisle have valid points, and there are places for system level reviews and component level reviews. The major factor is that the reviewer and readers must be very careful to understand what the tests are really testing and what the numbers mean.

    For this article, one of the major goals is to determine which midrange card offers the best quality and performance for the money at stock clock speeds at this point in time. If we tested with a well-aged 2.8GHz NetBurst-era Celeron CPU, much of our testing would show every card performing the same until games became heavily graphics limited. Of course, it would be nice to know how a graphics card would perform in a common midrange PC, but this doesn't always help us get to the bottom of the value of a card.

    For instance, if we are faced with two midrange graphics cards that cost the same and perform nearly the same on a midrange CPU, does it really matter which one we recommend? In our minds, it absolutely does. Value doesn't end with the performance the average person will get from the card when they plug it into a system. What if the user wants to upgrade to a faster CPU before the next GPU upgrade? What about reselling the card when it's time to buy something faster? We feel that it is necessary to test with high end platforms in order to offer the most complete analysis of which graphics solutions are actually the best in their class. As this is our goal, our test system reflects the latest in high end performance.
