Unreal Tournament 3 Beta Demo: Top to Bottom GPU Analysis
by Derek Wilson on October 18, 2007 4:00 AM EST
Posted in: GPUs
Bringing it all Together
With our look at the low end and mainstream markets wrapped up, it's time to see how everything stacks up against everything else. Yes, it is difficult to really see what's going on here, but that's why we broke our GPU coverage down into three bite-sized parts. We can pick a monitor size and look at how much graphics power we need to get the performance we want, or just see how much of an advantage one class of cards has over another. As we've already looked at 1024x768, 1280x1024, and 1920x1200 (in our previous article), here we will break out 1600x1200 and then look at scaling.
Looking at this graph, the AMD performance advantages are certainly clear. Bringing it all home is our scaling graph:
There isn't much to be said here that hasn't already been covered. We do see more instances of cards scaling differently in this test than usual, and of course, there is the fact that AMD performance is quite good.
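For anyone who wants to build the same view from raw numbers: a scaling graph simply normalizes each card's frame rate at every resolution against its own score at the lowest resolution. Here is a minimal sketch of that calculation (the card names and frame rates are illustrative placeholders, not our measured results):

```python
# Minimal sketch: convert raw FPS-per-resolution results into the relative
# numbers a scaling graph plots. All values below are illustrative only.

RESOLUTIONS = ["1024x768", "1280x1024", "1600x1200", "1920x1200"]

# Hypothetical average FPS for each card at the resolutions above.
results = {
    "Card A": [142.0, 118.0, 91.0, 78.0],
    "Card B": [120.0, 104.0, 86.0, 75.0],
}

def scaling(fps_per_res):
    """Normalize each score to the lowest-resolution result (= 1.0)."""
    baseline = fps_per_res[0]
    return [fps / baseline for fps in fps_per_res]

for card, fps in results.items():
    rel = scaling(fps)
    print(card, ["%.2f" % r for r in rel])
# The card whose curve falls off more slowly retains more of its
# performance as resolution climbs, i.e. it "scales better."
```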
Final Words?
Well, these can't really be the final words on Unreal Tournament 3 performance, as running a flyby in the beta version of a demo for the game is more of a preview of what might come to pass. Epic still has time to refine their software, and AMD and (especially) NVIDIA will be working on prepping their drivers for the launch of the game as well.
While these tests did a good job of reproducing the numbers we saw when running around with FRAPS, a full multiplayer timedemo with character models, effects, and high resolution textures (which are not included in this beta demo) could really change what we see here. When we are able to get access to the demoplay feature of UT3, we will certainly revisit our tests with a much more brutal workload.
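For reference, recovering an average frame rate from a FRAPS run is straightforward. A minimal sketch, assuming a two-column log of frame index and cumulative elapsed milliseconds (roughly how FRAPS frametime logs are laid out; the filename below is hypothetical):

```python
import csv

def average_fps(path):
    """Average FPS from a frametime log whose rows are
    (frame index, cumulative elapsed time in milliseconds)."""
    times_ms = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            times_ms.append(float(row[1]))
    intervals = len(times_ms) - 1
    elapsed_s = (times_ms[-1] - times_ms[0]) / 1000.0
    return intervals / elapsed_s

# Hypothetical log name -- point this at a real FRAPS frametimes file:
# print(average_fps("ut3_flyby_frametimes.csv"))
```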
Hopefully this look at GPU performance under UT3 has been insightful, but our testing really leaves us with more questions than answers. Will AMD remain on top when the game launches? How will final game performance compare to this beta preview? Will the demo playback functionality change the playing field? Stay tuned, and we will answer these questions as soon as we can.
34 Comments
Ecmaster76 - Thursday, October 18, 2007 - link
I bet all those people who bought x1k cards are feeling pretty good right now. Once again, the Radeon has shown superior longevity over the long haul compared to the GeForce (assuming that future UT3 versions and drivers don't change the results significantly).
legoman666 - Thursday, October 18, 2007 - link
I'm still running with my x1800xt. The problem is, I never see benchmarks for it for new games. I can't really compare it to the x1950xt either, since they're different cores. Is there any way future reviews (or maybe this review could be updated?) could have x1800xt benchmarks included?
That being said, I can run all of the Orange Box games at 1280x1024, 4xAA, 8xAF, all max details with vsync on and still get 38fps (75Hz/2). As long as UT3 isn't much more demanding than the Source engine, I will probably be fine. Now that I typed that, I remember that Bioshock uses the UT3 engine. Bioshock also runs great on my machine with all the max details.
I guess a good thing about having a comparatively small monitor (1280x1024 instead of one of the larger widescreens) is that I still get decent frame rates, since the newer cards are built to drive those huge screens and I'm still using my "tiny" screen. Hopefully my monitor is the next thing that gets upgraded.
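The "38fps (75Hz/2)" figure in the comment above is the classic double-buffered vsync effect: when the GPU can't finish a frame within one refresh interval, the displayed rate snaps down to the next integer divisor of the refresh rate. A minimal sketch of that behavior (illustrative, not a measurement):

```python
def vsync_rate(refresh_hz, render_fps):
    """With double-buffered vsync, the displayed frame rate snaps to the
    largest integer divisor of the refresh rate at or below the render rate."""
    divisor = 1
    while refresh_hz / divisor > render_fps:
        divisor += 1
    return refresh_hz / divisor

# On a 75Hz panel, a GPU rendering at 60fps gets held to 75/2 = 37.5fps,
# which lines up with the ~38fps figure mentioned above.
print(vsync_rate(75, 60))  # 37.5
print(vsync_rate(75, 80))  # 75.0
```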
Spoelie - Friday, October 19, 2007 - link
The x1800xt is somewhat comparable to the x1950pro. It has less shading power but more pixel-pushing power, so in shader-heavy games like this its general performance will be slightly lower than the x1950pro's, but it will cope better with stuff like anti-aliasing, upping the resolution, and anisotropic filtering.
johnsonx - Thursday, October 18, 2007 - link
I'm personally not too sure about my 1950Pro AGP. I don't seem to be getting such great performance. My system specs out far better than my son's (me = X2 @ 2.5GHz, 1GB, X1950Pro AGP, Vista; son = A64-3500, 2GB, 7900GS PCIe, XP Pro), yet he appears to get better performance in UT3. I haven't benchmarked it, but he has all detail levels turned up to max while I run mine with the details one tick above minimum, yet his seems smoother than mine.
Between my slightly faster dual core (vs. his single core) and my more powerful video card, I ought to be able to run max detail (we both run 1280x1024 LCDs, which should be a walk in the park for my rig).
His system has only one thing better than mine, which is that he has 2GB of RAM while I have 1GB... but I haven't noted any swapping, and the game still loads pretty fast, so it doesn't seem memory constrained.
I know there are many variables here (Vista vs XP, 1GB vs 2GB, AGP vs PCIe), but none of those, AFAIK, should make all that much difference today (obviously the Vista vs XP thing was a big deal 6 months ago, but the drivers have largely reached performance parity, haven't they?). I guess I need to figure out how to run the benchmarks Derek did and see what's what.
Spoelie - Friday, October 19, 2007 - link
It's not the fact that you have 1GB or the fact that you have Vista, but the combination of the two that makes it a really sub-par gaming machine. You really ought to double the RAM if you want to game in Vista, and even then the same config would get a bit better performance running XP.
Also, re "seems choppy": do you have an LCD screen? Turn v-sync on, then.
mcnabney - Thursday, October 18, 2007 - link
You are running Vista, your son is running XP. Vista cripples gaming performance across the board.
ChronoReverse - Thursday, October 18, 2007 - link
Yeah, I have an x1950 and I'm feeling pretty plucky indeed =D
MrKaz - Thursday, October 18, 2007 - link
Why do you put the 2600XT in the same bag as the 8600GTS? The price difference is huge.
I can buy one good 2600XT for 100€ and one good 8600GTS for 190€.
The more correct comparison is (I think)
19x0XT = 8600GTS
2600XT = 8600GT
2600PRO = 8500GT
2400PRO/XT = 8400GS
dm0r - Thursday, October 18, 2007 - link
I compared on performance, not price... anyway, it looks like the 2600XT is getting mature with new drivers.
I would like to see power consumption tests, please.
cmdrdredd - Thursday, October 18, 2007 - link
Power consumption is covered elsewhere. Game performance reviews/previews/guides are for PERFORMANCE-based comparisons.