ATI Radeon HD 4890 vs. NVIDIA GeForce GTX 275
by Anand Lal Shimpi & Derek Wilson on April 2, 2009 12:00 AM EST
Posted in: GPUs
Mirror’s Edge: Do we have a winner?
And now we get to the final test. Something truly different: Mirror’s Edge.
This is an EA game. Ben had to leave before we got to this part of the test (he does have a wife and kid, after all), so I went at this one alone.
I’d never played Mirror’s Edge. I’d seen the videos, and it looked interesting. You play as a girl named Faith, a runner. You run across rooftops and through buildings; it’s all very parkour-like. You’re often being pursued by “blues”, police officers, as you run through the game. I won’t give away any plot details here, but this game, I liked.
GPU accelerated PhysX impacts things like how glass shatters and adds destructible cloth to the game. Late last year we posted a video of what the game looks like with NVIDIA GPU accelerated PhysX enabled:
"Here is the side by side video showing better what DICE has added to Mirror's Edge for the PC with PhysX. Please note that the makers of the video (not us) slowed down the game during some effects to better show them off. The slow downs are not performance related issues. Also, the video is best viewed in full screen mode (the button in the bottom right corner)."
In Derek’s blog about the game he said the following:
“We still want to really get our hands on the game to see if it feels worth it, but from this video, we can at least say that there is more positive visual impact in Mirror's Edge than any major title that has used PhysX to date. NVIDIA is really trying to get developers to build something compelling out of PhysX, and Mirror's Edge has potential. We are anxious to see if the follow through is there.”
Well, we have had our hands on the game and I’ve played it quite a bit. I started with PhysX enabled, looking for the SSD effect: I wanted to play with it on, then take it away and see if I missed it. I played through the first couple of chapters with PhysX enabled, fell in lust with the game and then turned off PhysX.
I missed it.
I actually missed it. What did it for me was the way the glass shattered. When I was being pursued by blues and they were firing at me as I ran through a hallway full of windows, the hardware accelerated PhysX version was more believable. I felt more like I was in a movie than in a video game. Don’t get me wrong, it wasn’t hyper realistic, but the effect was noticeable.
I replayed a couple of chapters and then played some new ones, this time with PhysX disabled, before turning it back on and repeating the test.
The impact of GPU accelerated PhysX was noticeable. EA had done it right.
The Verdict?
So am I sold? Would I gladly choose a slower NVIDIA part because of PhysX support? Of course not.
The reason why I enjoyed GPU accelerated PhysX in Mirror’s Edge was because it’s a good game to begin with. The implementation is subtle, but it augments an already visually interesting title. It makes the gameplay experience slightly more engrossing.
It’s a nice bonus if I already own an NVIDIA GPU, but it’s not a reason to buy one.
The fact of the matter is that Mirror’s Edge should be the bare minimum requirement for GPU accelerated PhysX in games. The game has to be good to begin with, and the effects should be the cherry on top. Crappy titles and gimmicky physics aren’t going to convince anyone, and aggressive marketing on top of that is merely going to push people like us to call GPU accelerated PhysX out for what it is. I can’t even call the overall implementations I’ve seen in games half baked; the oven isn’t even preheated yet. Mirror’s Edge, so far, is an outlier. You can pick a string of cheese off of a casserole and like it, but without some serious time in the oven it’s not going to be a good meal.
Then there’s the OpenCL argument. NVIDIA won’t port PhysX to OpenCL, at least not anytime soon. Havok, however, is being ported to OpenCL, which means that by the end of this year any game using OpenCL Havok could have GPU accelerated physics on any OpenCL compliant video card (NVIDIA, ATI, and Intel once Larrabee arrives).
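To make that vendor-neutrality argument concrete, here is a minimal, hypothetical OpenCL sketch of our own (this is not Havok or PhysX code): the same host program grabs whichever GPU happens to be installed, NVIDIA or ATI, and runs a toy one-step particle integration on it. The kernel name step, the particle count and the 60Hz timestep are all our own inventions, and error checking is omitted for brevity.

```c
/* toy_cl_physics.c - a hypothetical sketch of vendor-neutral GPU physics.
 * The same binary runs on any OpenCL compliant GPU (NVIDIA, ATI, ...).
 * Example build: gcc toy_cl_physics.c -lOpenCL -o toy_cl_physics */
#include <stdio.h>
#include <stdlib.h>
#include <CL/cl.h>

#define N 1024  /* number of particles (arbitrary) */

/* A trivial "physics" kernel: one Euler integration step per particle. */
static const char *src =
    "__kernel void step(__global float4 *pos, __global float4 *vel, float dt)\n"
    "{\n"
    "    int i = get_global_id(0);\n"
    "    vel[i].y -= 9.8f * dt;   /* gravity */\n"
    "    pos[i] += vel[i] * dt;   /* integrate position */\n"
    "}\n";

int main(void)
{
    /* Ask for *a* platform and *a* GPU; which vendor made it is irrelevant. */
    cl_platform_id platform;
    cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    char name[256];
    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
    printf("Running on: %s\n", name);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, NULL);

    /* Host-side particle state, copied into device buffers. */
    cl_float4 *pos = calloc(N, sizeof(cl_float4));
    cl_float4 *vel = calloc(N, sizeof(cl_float4));
    cl_mem dpos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 N * sizeof(cl_float4), pos, NULL);
    cl_mem dvel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 N * sizeof(cl_float4), vel, NULL);

    /* Compile the kernel at runtime for whatever device we found. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "step", NULL);

    float dt = 1.0f / 60.0f;  /* one 60Hz frame */
    clSetKernelArg(k, 0, sizeof(cl_mem), &dpos);
    clSetKernelArg(k, 1, sizeof(cl_mem), &dvel);
    clSetKernelArg(k, 2, sizeof(float), &dt);

    size_t global = N;
    clEnqueueNDRangeKernel(queue, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(queue, dpos, CL_TRUE, 0, N * sizeof(cl_float4), pos,
                        0, NULL, NULL);

    printf("particle 0 after one step: y = %f\n", pos[0].s[1]);

    clReleaseMemObject(dpos); clReleaseMemObject(dvel);
    clReleaseKernel(k); clReleaseProgram(prog);
    clReleaseCommandQueue(queue); clReleaseContext(ctx);
    free(pos); free(vel);
    return 0;
}
```

The point of the sketch is what’s missing: there is no vendor check anywhere. Swap the video card and the same code runs, which is what a Havok-on-OpenCL backend would buy developers and what PhysX-on-CUDA, by design, cannot.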
While I do believe that NVIDIA and EA were on to something with the implementation of PhysX in Mirror’s Edge, I do not believe NVIDIA is strong enough to drive the entire market on its own. Cross platform APIs like OpenCL will be the future of GPU accelerated physics; they have to be, simply because NVIDIA isn’t the only game in town. The majority of PhysX titles aren’t GPU accelerated at all, even on NVIDIA hardware, and I suspect it won’t take long for OpenCL accelerated Havok titles to equal the number that are once the port is ready.
Until we get a standard for GPU accelerated physics that all GPU vendors can use or until NVIDIA can somehow convince every major game developer to include compelling features that will only be accelerated on NVIDIA hardware, hardware PhysX will be nothing more than fancy lettering on a cake.
You wanted us to look at PhysX in a review of an ATI GPU, and there you have it.
Comments
SiliconDoc - Monday, April 6, 2009
Don't worry, it is mentioned in the article that their overclocking didn't have good results, so they're keying up a big fat red party for you soon. They wouldn't dare waste the opportunity to crow and strut around.
This was about announcing the red card, slamming NVIDIA for being late to market, denouncing CUDA and PhysX, and making an embarrassingly numerous amount of "corrections" to the article, including declaring the 2560 win not a win anymore, since the red card didn't do it.
That's ok, be ready for the change back to "2560 is THE BEST and wins" when the overclock review comes out.
:)
Don't worry, be happy.
tamalero - Thursday, April 9, 2009
SD, you seriously have a mental problem, right? I noticed that you keep bashing and being sarcastically insulting (among other things) to anyone who supports ATI.
SiliconDoc - Thursday, April 23, 2009
No, not true at all, there are quite a few posts where the person declaring their ATI fealty doesn't lie their buttinski off - and those posts I don't counter. Sorry, you must be a raging goofball too who can't spot liars.
It's called LOGIC; that's what you use against the liars - you know, scientific accuracy.
Better luck next time - If you call me wrong I'll post a half dozen red rooster rooters in this thread that don't lie in what they say and you'll see I didn't respond.
Now, you can apologize any time, and I'll give you another chance, since you were wrong this time.
Nfarce - Thursday, April 2, 2009
I just finished a mid-range C2D build, and decided to go with the HD 4870 512MB version for $164.99 (ASUS, no sale at NE, but back up to $190 now). This was my first ATI card and it was a no-brainer. While the 4890 is a better card, to me it is not worth nearly $100 more, especially considering I'm gaming at either 1920x1200 on a 40" LCD TV or 1680x1050 on a 22" LCD monitor. Nvidia has lost me after 12 years as a fanboy, for the time being I suppose. What I will do here when I have more time is determine whether buying another 4870 512MB for CrossFire is the better bang for my resolutions, or eventually moving up to the 4890 when the price drops this summer and then selling the 4870.
Thanks for the GREAT review AT, and now I have my homework cut out for me for comparisons with your earlier GPU reviews.
Jamahl - Thursday, April 2, 2009
Good job with tidying up the conclusion, Anand.
Russ2650 - Thursday, April 2, 2009
I've read that the 4890 has 959M transistors, 3M more than the 4870.
Gary Key - Thursday, April 2, 2009
That is correct and is discussed on page 3. The increase in die size is due to power delivery improvements to handle the increased clock speeds.
Warren21 - Thursday, April 2, 2009
Maybe the tables should be updated to reflect this?
Gary Key - Thursday, April 2, 2009
They are... :)
helpmespock - Thursday, April 2, 2009
I've been sitting on my 8800GT for a while now and was thinking about going to a 4870 1GB model, but now I may hold off and see what prices do. If the 4890/275 force the 4870 down in price, then great, I'll go with that; on the other hand, if prices on the new parts slip off of the $250 mark, then I'll be tempted by those instead.
Either way I think I'm waiting to see how the market shakes out and in the end I, the consumer, will win.