ATI Radeon HD 4890 vs. NVIDIA GeForce GTX 275
by Anand Lal Shimpi & Derek Wilson on April 2, 2009 12:00 AM EST
Posted in: GPUs
Final Words
NVIDIA is competitive at this new price point of $250 depending on what resolution you look at. We also see some improvement from NVIDIA's new 185 series driver and get a new feature to play with in the form of Ambient Occlusion. We did look at PhysX and CUDA again, and, while we may be interested in what is made possible by them, there is still a stark lack of compelling content that takes advantage of these technologies. We can't recommend prioritizing PhysX and CUDA over performance, and performance is where a GPU needs to compete. Luckily for NVIDIA, the GTX 275 does.
The GTX 275's worst-case performance is still better than the GTX 260 Core 216's, and in the best case it can match the GTX 280. It often posted performance more in line with its bigger brothers than with a part $50+ cheaper. That's pretty sweet for a $250 card, especially as many games these days rely very heavily on shader performance. The GeForce GTX 275 is a good fit for this price point. But then there's the Radeon HD 4890.
The 4890, basically a tweaked and overclocked 4870, does improve performance over the 4870 1GB and puts up good competition for the GTX 275. On a pure performance level the 4890 and GTX 275 trade blows at different resolutions. The 4890 tends to look better at lower resolutions while the GTX 275 is more competitive at high resolutions. At 1680 x 1050 and 1920 x 1200 the 4890 is nearly undefeated. At 2560 x 1600, it seems to be pretty much a wash between the two cards.
At the same time, there are other questions, like that of availability. With these parts performing so similarly, and prices being pretty much equal, the fact that AMD's parts can be bought starting today while we have to wait for NVIDIA's is an advantage for AMD. However, we have to factor in that AMD driver support doesn't have the best track record as of late for new game titles. Add in that NVIDIA's developer relations seem more effective than AMD's, and we could see more titles that run better on NVIDIA hardware in the future. So what to go with? Really it depends on what resolutions you're targeting and what the prices end up being. If you've got a 30" display, then either card will work; it's just up to your preference and the items we talked about earlier. If you've got a 24" or smaller display (1920x1200 or below), then the Radeon HD 4890 is the card for you.
AMD tells us that most retailers will feature mail-in rebates of $20, a program apparently underwritten by AMD. Could AMD have worried, late in the game, that it wasn't coming in at a high enough performance level and decided to throw in an extra incentive? Either way, not everyone likes a mail-in rebate. I much prefer the instant variety, and mail-in rebate offers do not make decisions for me. We still compare products based on their MSRP (which is likely the price they'll return to once the rebate goes away). This is true for both AMD and NVIDIA parts.
There will also be overclocked variants of the GTX 275 to compete with the overclocked variants from AMD. The overclock on the AMD hardware is fairly modest, but does make a difference and the same holds true for the GTX 275 products in early testing. We'll have to take a look at how such parts compare in the future along with SLI and CrossFire. In the meantime, we have another interesting battle at the $250 price point.
294 Comments
7Enigma - Thursday, April 2, 2009 - link
And just go and disregard everything I typed (minus the different driver versions). Xbit apparently underclocked the 4890 to stock speeds. So I have no clue how the heck their numbers are so significantly different, except they have this posted under system settings:
ATI Catalyst:
Smoothvision HD: Anti-Aliasing: Use application settings/Box Filter
Catalyst A.I.: Standard
Mipmap Detail Level: High Quality
Wait for vertical refresh: Always Off
Enable Adaptive Anti-Aliasing: On/Quality
Other settings: default
Nvidia GeForce:
Texture filtering – Quality: High quality
Texture filtering – Trilinear optimization: Off
Texture filtering – Anisotropic sample optimization: Off
Vertical sync: Force off
Antialiasing - Gamma correction: On
Antialiasing - Transparency: Multisampling
Multi-display mixed-GPU acceleration: Multiple display performance mode
Set PhysX GPU acceleration: Enabled
Other settings: default
If those are set differently in Anand's review I'm sure you could get some weird results.
SiliconDoc - Monday, April 6, 2009 - link
LOL - set PhysX gpu acceleration enabled.
roflmao
Yeah man, I'm gonna get me that red card... ( if you didn't detect sarcasm, forget it)
tamalero - Thursday, April 9, 2009 - link
Good to know you blame everyone for "bad reading understanding". Let's see:
ATI Catalyst:
Smoothvision HD: Anti-Aliasing: Use application settings/Box Filter
Catalyst A.I.: Standard
Mipmap Detail Level: High Quality
Wait for vertical refresh: Always Off
Enable Adaptive Anti-Aliasing: On/Quality
Other settings: default
Nvidia GeForce:
Texture filtering – Quality: High quality
Texture filtering – Trilinear optimization: Off
you see the big "NVIDIA GEFORCE:" right below "other settings"?
that means the physX was ENABLED on the GEFORCE CARD.
you sir, are a nvidia fanboy and a big douché
SiliconDoc - Thursday, April 23, 2009 - link
More personal attacks, when YOU are the one who can't read, you IDIOT. Here are my first two lines: LOL - set PhysX gpu acceleration enabled.
roflmao
_____
Then you tell me it says PhysX is enabled - which is what I pointed out. You probably did not go see the linked test results at the other site and put two and two together.
Look in the mirror and see who can't read, YOU FOOL.
Better luck next time crowing barnyard animal.
"Cluckle red 'el doo ! Cluckle red 'ell doo !"
Let's see: I say PhysX is enabled, and you scream at me to point out it says PhysX is enabled, and call me an nvidia fan because of it - which would make you an nvidia fan as well, according to you, IF you knew what the heck you were doing, which YOU DON'T.
That makes you - likely a red rooster... I may check on that - hopefully you're not a noob poster, too, as that would reduce my probabilities in the discovery phase. Good luck, you'll likely need it after what I've seen so far.
7Enigma - Thursday, April 2, 2009 - link
Looked even closer and the drivers used were different.
ATI Drivers:
Anand-9.4 beta
Xbit-9.3
Nvidia:
Anand-185
Xbit-182.08
ancient46 - Thursday, April 2, 2009 - link
I don't see the fun in shooting cloth and unrealistic non-impact-resistant windows in high rise buildings. The video with the cloth was distracting; it made me wonder why it was there. What was its purpose? My senior eyes did not see much of an improvement in the videos in the CUDA application.
SiliconDoc - Monday, April 6, 2009 - link
Maybe someday you'll lose your raging red fanboy bias, break down entirely, toss out your life religion, and buy an nvidia card. At that point perhaps Mirror's Edge will come with it, and after digging it out of the trash can (you had second thoughts), you'll try it, and like Anand, really like it - turn it off, notice what you've been missing, turn it back on, and enjoy. Then after all that, you can crow "meh". I suppose after that you can revert to red rooster raging fanboy - you'll have to have your best red bud rip you from your Mirror's Edge addiction, but that's ok, he's a red and will probably smack you for trying it out - and have a clean shot with how absorbed you'll be.
Well, that should wrap it up.
poohbear - Thursday, April 2, 2009 - link
Are the driver issues for AMD that significant that they need to be mentioned in a review article? I'm asking in all honesty as I don't know. Also, this close developer relationship NVIDIA has with developers - does it show up in any games to significantly give a performance edge to NVIDIA video cards? Is there an example game out there for this? Thanks.
7Enigma - Thursday, April 2, 2009 - link
Look no further than this article. :) Here's the quote:
"The first thing about Warmonger is that it runs horribly slow on ATI hardware, even with GPU accelerated PhysX disabled. I'm guessing ATI's developer relations team hasn't done much to optimize the shaders for Radeon HD hardware. Go figure."
But ATI also has some relations with developers that show an unusually high advantage as well (Race Driver G.R.I.D. for example). All in all, as long as no one is cheating by disabling effects or screwing with draw distances, it only benefits the consumer for the games to be optimized. The more one side pushes for optimizations, the more the other side is forced, or risk losing the benchmark wars (which ultimately decides purchases for most people).
SkullOne - Thursday, April 2, 2009 - link
The conclusion mentions NVIDIA's partners releasing OC boards but nothing about AMD's. There are already two versions of the XFX HD4890 on Newegg: one is 850MHz core and the other is 875MHz core.
The HD4890 is geared to open up that SKU of "OC" cards for AMD. People with stock cooling and stock voltage can already push the card to 950+MHz. On the ASUS card you can boost voltage to the GPU, which has allowed people to get over 1GHz on their GPU. As the card matures, seeing 1GHz cores on stock cooling and voltage will become a reality.
It seems like these facts are being ignored.