Unreal Tournament 3 CPU & High End GPU Analysis: Next-Gen Gaming Explored
by Anand Lal Shimpi & Derek Wilson on October 17, 2007 3:35 AM EST - Posted in GPUs
AMD vs. Intel - Clock for Clock
Now it's time to tackle the touchy subject: how do AMD and Intel stack up to one another? First off, let's look at identical clock speeds to compare architectures.
At 3.0GHz, admittedly at a CPU-bound resolution, Intel holds a 26-31% performance advantage over AMD. Intel's Core 2 processors have historically performed better clock for clock than AMD's K8, so this isn't too much of a surprise, but it's an important data point nonetheless.
We then cranked the resolution up to 1920 x 1200 and raised the world detail slider to 5 to give us a more realistic scenario for this clock speed comparison. The results were a bit surprising:
Despite this being a mostly GPU-bound scenario, Intel still managed a 9% performance advantage over AMD at 3.0GHz. We suspect something fishy is going on here: the test is quite GPU-bound, yet switching from the Intel to the AMD platform yields a sizable performance drop.
We then compared a 3.0GHz Athlon 64 X2 to its closest Intel price competitor, the Core 2 Duo E6550 (2.33GHz), at our high-res settings:
The Intel performance advantage drops to 7% on average, but it's still much larger than it should be given that we're dealing with a GPU-bound scenario. Note that the difference between 2.33GHz and 3.0GHz on Intel is next to nothing, confirming that we are GPU-limited, so we're dealing with either an Unreal Engine 3 issue related to AMD's CPUs or a problem with the nForce 590 SLI chipset/drivers we used. We've let Epic know, but for now it looks like UT3 definitely prefers Intel's Core 2, even when GPU-bound.
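For those curious how the percentage advantages quoted here are derived, they are simple ratios of the average frame rates from each run. Below is a minimal sketch of that arithmetic; the frame-rate values in it are hypothetical placeholders for illustration, not measurements from this article.

```python
# Minimal sketch: how a "percent advantage" figure is computed from
# two average frame rates. The numbers below are hypothetical, not
# results from this article's benchmarks.

def percent_advantage(intel_fps: float, amd_fps: float) -> float:
    """Return Intel's advantage over AMD as a percentage of the AMD result."""
    return (intel_fps / amd_fps - 1.0) * 100.0

if __name__ == "__main__":
    intel_avg, amd_avg = 109.0, 100.0  # hypothetical averages
    print(f"Intel advantage: {percent_advantage(intel_avg, amd_avg):.1f}%")  # -> 9.0%
```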
72 Comments
kmmatney - Wednesday, October 17, 2007 - link
"The benchmarks show that AMD CPUs are not performing as well as they should here. This will hopefully be fixed in the future."

You sound like someone who has an AMD processor and is bitter...
clairvoyant129 - Wednesday, October 17, 2007 - link
These are the same people who said there was a big difference between NetBurst CPUs and K8s. Right, if a NetBurst CPU coupled with a 7800GTX got 60 FPS when a K8 got 90 FPS, it was a huge difference to them, but now it doesn't seem like it.

hubajube - Wednesday, October 17, 2007 - link
I'm definitely not bitter, just realistic. The difference between 90 and 180 fps is totally irrelevant. An Intel E2140 gets over 90fps. Hell, a Sempron with a decent video card could play this game extremely well.
Benchmarks are great in that you can use them to judge how your system will perform with a game, but they're not the be-all, end-all of performance, nor is a CPU that does 100 fps a pile of shit because it doesn't do 105 fps. And how should they be performing, in your opinion? 100 fps isn't good enough for you? How about 500 fps? Is that better?
JarredWalton - Wednesday, October 17, 2007 - link
The point is that at 1920x1200 we're at a completely GPU-limited resolution (as shown by the fact that the difference between the E6550 and X6850 is only 1%). AMD still runs 9% slower, so it seems that architecture, cache, etc. mean that even at GPU-limited resolutions AMD is slower than we would expect. Is it unplayable? No, but we're looking at the top-end AMD CPU (6400+), and in CPU-limited scenarios it's still 10% slower than an E6550.

It seems to me that we're in a similar situation to what we saw at the end of the NetBurst era: higher clock speeds really aren't bringing much in the way of performance improvements. AMD needs a lot more than just CPU tweaks to close the gap, which is why we're all waiting to see how Phenom compares.
clairvoyant129 - Wednesday, October 17, 2007 - link
That 9% was at 1920x1200. The majority of PC users run a much lower resolution than that. At 1024x768, it's much, much higher.

Think again, moron.
KAZANI - Wednesday, October 17, 2007 - link
And most people don't care about framerates higher than their monitor's refresh rate. Both processors were well above 100 fps at 1024x768.

hubajube - Wednesday, October 17, 2007 - link
No, moron, 1024x768 on an 8800GTX is NOT what "most PC users" are going to be using. The video cards that "most PC users" will be using were not tested in this benchmark. YOU need to actually THINK next time.

clairvoyant129 - Wednesday, October 17, 2007 - link
Where did I say the majority of PC users with an 8800GTX use 1024x768? What's your idea of testing CPUs? Benchmarking them at GPU-limited resolutions? What a joke. You people never complained when Anand compared NetBurst CPUs to K8s at 1024x768 or lower resolutions.

Don't get your panties twisted, AMD fanny.
IKeelU - Wednesday, October 17, 2007 - link
Ummm... how do you launch the flybys used in this analysis?

customcoms - Wednesday, October 17, 2007 - link
You mention that you cranked the resolution to 1920x1200, but the charts still say 1024x768... the results look like those at 1920x1200 though, so I'm guessing it's a typo. GPU-bound CPU comparison charts here: http://anandtech.com/video/showdoc.aspx?i=3127&...