SimCity 4 Performance (no AA/AF)

We drop about a third of a frame per second, which is negligible at 74 fps. The minimum frame rate is 5 fps higher in this benchmark, though, so that is the main benefit we see in this title.

SimCity 4 Performance (4xAA/8xAF)

This time we get a higher average frame rate, by about two and a quarter fps. That isn't a big difference at 71 fps, but it's still something.
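To put those deltas in perspective, here is a quick back-of-the-envelope sketch (the helper below is our own, using only the figures quoted above):

```python
def percent_gain(baseline_fps: float, delta_fps: float) -> float:
    """Return a frame-rate delta as a percentage of the baseline."""
    return delta_fps / baseline_fps * 100.0

# SimCity 4, no AA/AF: roughly a third of a frame lost on a 74 fps average
print(f"no AA/AF:  {percent_gain(74.0, -0.33):+.2f}%")   # about -0.45%

# SimCity 4, 4xAA/8xAF: about 2.25 fps gained at roughly 71 fps
print(f"4xAA/8xAF: {percent_gain(71.0, 2.25):+.2f}%")    # about +3.17%
```

At under half a percent, the change in the no-AA/AF test is within run-to-run variation; only the antialiased test shows a gain worth mentioning.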

Comments

  • sandorski - Wednesday, October 8, 2003

    #3 has the best comment: is there a way to get the info on the core/mem speed being used? It just seems that outside of the OpenGL benches, the increases were too small to really know whether overclocking was happening or not. IOW, the gains were small enough to fall within statistical noise.

    Certainly any gain is good, but I was able to achieve better improvements from overclocking on a number of video cards. Either ATI is being really conservative (which makes sense, since they have warranty concerns) or Overdrive isn't working (possible, but I'd choose conservatism as the reason).

    I'd also be interested in the results of using exotic cooling (not that I'd use it). It would be especially interesting to see whether the Overdrive feature is able to recognize graphic corruption or whether it is solely concerned with temperature. (See the sketch after the comments for an illustration of that distinction.)
  • Anonymous User - Wednesday, October 8, 2003

    The XT is how the original 9800 Pro should have been released six months ago.
  • Anonymous User - Wednesday, October 8, 2003

    ZERO DX9 games???

    Then please explain how it is possible that I have a DX9 game on my machine at the moment. (A TWIMTBP title, even :-)

    But if the rare DX9 games available at the moment are not the ones you are interested in, then indeed it is better to wait (if your card is still fast enough for the current DX8 games).
    But if you do need to buy a new card right now (maybe the old one broke), then the choice is very clear.
  • Anonymous User - Wednesday, October 8, 2003

    The conclusion didn't say the ATI was a bad card to buy; it just said it's a bad time to buy a card.
  • Anonymous User - Wednesday, October 8, 2003

    You're an idiot, #8. HL2 won't be out for months and there are basically ZERO DX9 games out. Therefore, it'd be stupid to buy a card now based on basically no data. And no, ShaderMark 2.0 barely counts as hard data.
  • Anonymous User - Wednesday, October 8, 2003

    Lol #4. ;)

    I am extremely perplexed by the conclusion myself. Specifically, the last sentence:

    "Even so, we are still standing behind our wait and see recommendation with respect to purchasing a card intended for use with the coming DX9 games."

    Hello? AnandTech? If you are buying these cards with an eye to DX9 performance, the choice is clear: ATI. At best, nVidia "optimizations" might get DX9 performance on par with ATI's. At worst, it's about 50% of ATI's performance.

    It's only if you DON'T care about DX9 performance that the FX line even becomes an option.

    Your conclusion is completely, 100% backwards.
  • DerekWilson - Wednesday, October 8, 2003

    I'd love nothing more than to strap a water cooling solution on this bad boy and watch it fly! Don't know if anyone's gonna let me tear stuff apart yet though ;-)
  • Anonymous User - Wednesday, October 8, 2003

    Derek, thanks for holding off on the buying advice.
  • Anonymous User - Wednesday, October 8, 2003

    You should do another comparison that puts some extra cooling on the card, either by replacing the stock heatsink/fan or simply blowing more air across the card as if someone set a card cooler next to the video card. If the drivers worked well, this should noticeably improve card performance in a fairly inexpensive way.
  • Anonymous User - Wednesday, October 8, 2003

    Good benchmark... but you know, you've probably struck a nerve with all the ATi fanboys with that conclusion ;)

    And yes, I own a Radeon. Hate of fanboys is (or should be) universal.
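
On sandorski's Overdrive question above: the sketch below contrasts a controller that reacts only to temperature (which is how ATI describes Overdrive) with one that also backs off when rendering corruption is detected. The clock ladder, threshold, and function names are hypothetical, invented for illustration; they are not ATI's actual values or logic.

```python
CLOCK_STEPS_MHZ = [412, 425, 440]   # hypothetical core-clock ladder
TEMP_LIMIT_C = 75                   # hypothetical throttle threshold

def next_clock(current_idx: int, temp_c: float, corruption_seen: bool) -> int:
    """Pick the next clock step from temperature and (optionally) corruption."""
    if corruption_seen or temp_c >= TEMP_LIMIT_C:
        # Back off one step. A corruption-aware policy reacts to artifacts;
        # a purely thermal policy, like Overdrive appears to be, only to heat.
        return max(current_idx - 1, 0)
    # Headroom available: step up unless already at the top of the ladder.
    return min(current_idx + 1, len(CLOCK_STEPS_MHZ) - 1)

# With a ladder this conservative the card tops out quickly, so measured
# gains stay small even when the thermal limit is never reached.
idx = 0
for temp in (58, 63, 69, 77):
    idx = next_clock(idx, temp, corruption_seen=False)
    print(f"temp {temp}C -> {CLOCK_STEPS_MHZ[idx]} MHz")
```

The distinction sandorski raises is visible in the first branch: a purely thermal controller will keep a clock that produces artifacts as long as the die stays cool, while a corruption-aware one would back off immediately.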
