The Clarkdale Review: Intel's Core i5 661, i3 540 & i3 530
by Anand Lal Shimpi on January 4, 2010 12:00 AM EST - Posted in CPUs
Intel HD Graphics: A Lot Better
With Clarkdale Intel has finally dropped the Graphics Media Accelerator (GMA) prefix. Intel integrated graphics is now just called Intel HD Graphics.
The move to 45nm gave Intel the ability to beef up its graphics core a bit, but ultimately it’s the same architecture as the G45 - just faster. We won’t see Larrabee on a CPU for some years to come.
The GMA X4500 core in G45 had 10 shader processors; Intel HD Graphics bumps that up to 12, which is a 20% increase in raw shader count on its own. A number of internal tweaks and performance enhancements should push the overall gain beyond 20%.
All integrated GPUs, regardless of vendor, pretty much suck. Intel takes the most heat because while other IGPs may offer 30+ fps in games at the lowest quality settings, Intel can often only manage single-digit frame rates. It doesn't take much searching to prove that one.
Clarkdale does change that a bit. Intel has finally delivered an integrated graphics solution that is at least competitive with existing IGPs on the market. To show you how far it's come, I've pitted our Clarkdale-based Core i5 661 against an AMD Phenom II X4 965 with 790GX graphics. Our 790GX platform had 128MB of on-board memory to drive performance even higher. If Intel can manage a win here, it'll be a convincing one.
I've also tossed in a G45 board for good measure. The only absent member is NVIDIA's GeForce 9400. We found in our 790GX review that AMD delivered roughly the same graphics performance as the 9400 (if not better), so any advantage or disadvantage here should apply to NVIDIA as well.
We’ll start off with Batman: Arkham Asylum. This is an Unreal Engine 3 based game. The first thing you need to remember about integrated graphics is that regardless of the game, you’ll want to go in and turn down every single quality setting at your disposal. In this case I ran Batman at 1024 x 768 with all quality options set to low.
That's Batman running on Intel HD Graphics
Batman doesn't look half bad at the playable IGP settings. It's surprising. Tim Sweeney once told me that a good-looking game is half engine, half art. It looks like Batman just has the right combination of engine and art to look decent even on Intel's integrated graphics. It's not great by any means, but it's not pixelated mush.
The performance is also halfway respectable. Intel's fastest integrated graphics manages to tie AMD's 790GX IGP at 35 fps. It's also over twice as fast as G45. Even the lower-end Core i3 CPUs should manage close to 30 fps here.
Next up is Dragon Age. Unfortunately, this game doesn't look as good at its playable integrated graphics settings. It ends up looking like a 3D King's Quest played on a PS2.
Giggle.
Performance from the new Intel HD Graphics is respectable. At 41.5 fps it's actually faster than AMD's 790GX chipset, and definitely more than twice the speed of the old G45. Keep in mind that we're looking at Intel's highest-end IGP; the Core i3s will be appreciably slower, most likely at or below the performance of the 790GX.
Dawn of War II on Intel HD Graphics
Dawn of War II looks and plays like crap on Intel's integrated graphics. The fastest Clarkdale averaged 15 fps, with minimum frame rates dropping as low as 3.4 fps. That's a huge improvement over G45, but definitely not what I would consider playable.
Intel is technically the leader here, though: AMD's 790GX only managed 12.1 fps. IGPs need not apply for this title at present.
Call of Duty: Modern Warfare 2 is the first title where we see Intel losing. The game loses much of its visual appeal at the settings required to make it playable on integrated graphics, especially on a larger screen.
AMD’s 790GX is around 40% faster than Intel’s HD Graphics. Blech.
World of Warcraft is a very important title for any GPU to run well, and unfortunately Intel loses this one to AMD. Note, however, that we are running at the "Good" setting rather than the bare minimum detail settings:
Our final integrated graphics game benchmark is HAWX.
At 53 fps, Intel falls behind the 790GX, but it's around twice the speed of G45 and fast enough that I'd consider it playable (albeit at the lowest possible settings).
Intel has taken a significant step forward with its integrated graphics. It's at the point where I'd say it's finally competitive with the best from AMD or NVIDIA. Intel has delivered on its promise to take integrated graphics more seriously, and I hope we will see even bigger performance gains with Sandy Bridge.
Intel took a big step forward, to the point where it is no longer the laughingstock of the graphics industry. But it stepped into a position of mediocrity, joining AMD and NVIDIA. Integrated graphics has never been good, regardless of the manufacturer. We honestly need around twice the existing performance to deliver a reasonable gaming experience on integrated graphics. AMD will ship its 8-series chipsets later in the year, and perhaps that will change things.
Despite the 45nm on-package GPU, the Core i5 661 actually draws more power at idle than the old G45 with an E8600. It's still much more power efficient than the equivalent from AMD. If you're building something with integrated graphics, you want it to be a Clarkdale.
To test power consumption under load I fired up a 1080p x264 video using Media Player Classic Home Cinema and measured total system power consumption:
A CPU swap and some tweaking later, our AMD power consumption numbers now make sense. While playing H.264-encoded video, the GPU does all of the heavy lifting and there's no power advantage for Clarkdale to rest on: when watching a movie, the AMD system's power draw is indistinguishable from our Clarkdale test bed's.
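For readers who want to run this kind of test themselves, the method is straightforward: start playback, log wall-power readings for the duration of the run, and average them. Below is a minimal sketch of the bookkeeping side, assuming a logging power meter that exports a CSV of timestamped watt readings; the power_log.csv name and its two-column layout are my own assumptions for illustration, not part of the actual test setup.

```python
# Minimal sketch: summarize wall-power readings captured during video playback.
# Assumes a logging power meter exporting "power_log.csv" with rows like:
#   2010-01-04T12:00:01,98.5
# (timestamp, watts) -- the file name and layout are illustrative assumptions.
import csv

def summarize_power(csv_path: str) -> None:
    watts = []
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            if len(row) >= 2:
                watts.append(float(row[1]))
    if not watts:
        print("No samples found.")
        return
    average = sum(watts) / len(watts)
    print(f"{len(watts)} samples: average {average:.1f} W, peak {max(watts):.1f} W")

if __name__ == "__main__":
    summarize_power("power_log.csv")
```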
93 Comments
Anand Lal Shimpi - Monday, January 4, 2010 - link
You're very right; looking at everyone else's results, it appears to be a side effect of the ASUS H57 board. I'm out in Vegas at CES now, but I'll run some numbers on Intel's H55 when I return this weekend.

Take care,
Anand
plonk420 - Monday, January 4, 2010 - link
this power consumption is a bit weird... i've actually done the same test

pc power & cooling 750 watt
i7-920, ex58-ud3r, 3x1 DDR3, hd5870, 2 hdds, 3 fans, HT on
default voltage: 215 watts in p95 (122 watts idle, no powersave mode)
undervolted 1.125v: 187 watts in p95 (121 watts idle, no powersave mode)
default, 8600GT: 211 watts in p95 (116 watts idle, no powersave mode), 184 watts on "sane load" (distributed computing)
1.125v, 8600GT: 183 watts in p95 (115 watts idle, no powersave mode), 164 watts on "sane load" (distributed computing)
i7-860, ex-p55m-ud2, 4x1 DDR3, 5870, 2 hdds, 2 fans, HT on
default voltage: (either i lost results, or i never tested them)
undervolted 1.025v: 167 watts p95 (107 watts idle, 101 watts in power saver), 149 watts on enigma@home (8 instances)
Spoelie - Monday, January 4, 2010 - link
The Load Power Consumption numbers on page 4 also raise questions. The Phenom system's power rises by 90W when decoding an x264 movie, while the Clarkdale system's only rises by 20W. It seems to me that the Clarkdale system had DXVA support on while the Phenom system had it off...
Can someone check/confirm this?
Anand Lal Shimpi - Monday, January 4, 2010 - link
I was caught off guard by it too, but DXVA was enabled. I'm currently out in Vegas for CES, but when I return I'll give it another look in our Core i3 review. I've had issues with power consumption being stuck at unusually high levels on AMD boards in the past, but I couldn't get this one to shake in time.

Take care,
Anand
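A crude way to check whether decode is actually being offloaded on your own system, by the way, is to watch CPU utilization during playback: hardware (DXVA) decode of 1080p H.264 should leave the CPU mostly idle, while a software fallback will load it heavily. A rough sketch using the psutil package follows; the 30-second sampling window is an arbitrary choice, and this is only a heuristic, not a definitive DXVA test.

```python
# Rough sanity check: sample CPU utilization while a 1080p clip is playing.
# Low, steady CPU load suggests hardware (DXVA) decode; sustained high load
# suggests the player has fallen back to software decoding. Start playback
# first, then run this script. Requires the psutil package.
import psutil

def sample_cpu(seconds: int = 30) -> None:
    readings = []
    for _ in range(seconds):
        # cpu_percent(interval=1) blocks for one second and returns overall usage.
        readings.append(psutil.cpu_percent(interval=1))
    average = sum(readings) / len(readings)
    print(f"Average CPU load over {seconds}s: {average:.1f}% (peak {max(readings):.1f}%)")

if __name__ == "__main__":
    sample_cpu()
```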
MrAwax - Monday, January 4, 2010 - link
HD codec bitstreaming was added to the HDMI 1.3 spec at the receiver manufacturers' request, for no reason except that they didn't want to lose their market.

Since the HD codecs are LOSSLESS, decoding them in the player or in the receiver makes NO difference. And HDMI has supported streaming of 8 uncompressed LPCM channels at 192kHz/24-bit since version 1.0 (see the quick bandwidth check below). So digitally, there is ZERO POINT ZERO difference. This is the reason it is useless, and this is the reason HDMI added 8 channels of high-resolution audio in the first place: so you won't need to upgrade your receiver!

And this is stupid because bitstreaming is a LIMITATION, not an added feature. The Blu-ray spec supports in-player audio mixing, and a lot of discs already use it: the player can mix sounds live onto the main soundtrack. This is useful for adding dynamic menu sounds or director/actor commentary. In theory, a disc could even carry a single music/fx soundtrack and dynamically mix in the voices to support multiple languages and save space on the disc (voice tracks are generally stereo, not always present, and re-encoding the music/fx for every language is a waste of storage). With HDMI bitstreaming, you can stream the main track only. Gone is the menu fx sound, gone is the bonus commentary, and gone is the mixed-in voice.

Welcome to receiver manufacturers lobbying for a USELESS and STUPID feature.

PS: on the contrary, 8 channels of high-resolution LPCM is a great feature.
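To put a number on the uncompressed-LPCM point in the comment above: 8 channels at 192kHz/24-bit works out to roughly 37Mbit/s, a trivial load for an HDMI link, which is why decoding in the player and sending LPCM gives up nothing. A quick back-of-the-envelope check, with Python used purely as a calculator:

```python
# Back-of-the-envelope bandwidth for fully uncompressed multichannel LPCM,
# the format the comment above says HDMI has carried since version 1.0.
channels = 8
sample_rate_hz = 192_000
bits_per_sample = 24

bits_per_second = channels * sample_rate_hz * bits_per_sample
print(f"8ch 192kHz/24-bit LPCM: {bits_per_second / 1e6:.1f} Mbit/s")
# -> roughly 36.9 Mbit/s, a tiny fraction of HDMI's multi-Gbit/s link budget.
```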
salacious - Monday, January 4, 2010 - link
Bitstreaming does have the disadvantage of losing audio track mixing, but it does offer something.

If the decoding is done on the PC, we are relying on it not to downsample to 48kHz/16-bit, which happens unless you have the correct combination of player and audio card. We are also relying on it to route the decoded channels to the correct speakers, and I have found that with 7.1 speaker setups this is a complete mess.

Also, if you want to apply any receiver effects to the audio, then over HDMI you tend to be limited. If the movie is 5.1 and EX-encoded and you have a 6.1/7.1 speaker setup, the usual solution is to apply that processing in the receiver; but if you set your PC to 7.1, it sends two blank audio channels and you can't apply EX processing. You have the same problem if you play back a surround-encoded stereo track on a 5.1 speaker system: the audio just comes out of the left/right speakers and you are unable to apply Pro Logic decoding to it.

A solution would be for the PC to offer Pro Logic IIx and other types of decoding, but since it doesn't, you need the receiver to do it, which means bitstreaming.
FlyTexas - Monday, January 4, 2010 - link
...Is this really needed? OK, so more speed is good, and I wouldn't turn it down... however, anyone who uses computers in an office environment knows that CPUs have been "fast enough" for a while now.
We currently have a dozen Dell Vostro 200s with Pentium Dual Core 1.6GHz CPUs, and a dozen Dell Vostro 220s with Pentium Dual Core 2.4GHz CPUs in the office (among a few other oddball machines and the server).
These computers run Office 2007, Adobe Acrobat 9, IE 8, etc. Some of them also run Quickbooks and a few other programs. None are used for video encoding, games, or anything that fancy.
You know what? The difference between the 1.6GHz machines and the 2.4GHz machines isn't all that noticeable once everything is up and running (they all have 2GB of RAM and run Windows XP Pro SP3). They all have Intel GMA graphics, and for the office, they are all plenty fast.
Why would our company upgrade to these new CPUs?
The Pentium Dual Core CPUs were a nice jump over the Pentium 4 line (we had all Pentium 4 machines back in 2006). This new line of CPUs doesn't seem like the same kind of improvement.
tomaccogoats - Monday, January 4, 2010 - link
I only look at new CPUs for gaming :p

lowlight - Monday, January 4, 2010 - link
You might notice a difference if they were running Windows 7 with 4GB of RAM. Right now they are all a bit bottlenecked by the OS (poor multitasking performance) and the low amount of RAM.

But in general I agree with what you're saying.
FlyTexas - Monday, January 4, 2010 - link
I have Windows 7 installed on my computers at home; I haven't moved the office machines because there's no good reason to do so.

We have Windows 2003 R2 Small Business Server at work. Everyone is on user accounts on a domain. We have a hardware firewall as well as the usual antivirus/spam/etc. stuff running.
It all just works. I have no desire to rip it all apart and spend a lot of the company's money to gain... well, I'm not sure what we'd gain. If I can't think of a good reason to do it, I sure can't sell it to the boss.
This isn't a knock against Microsoft; I personally love Windows 7, and it's great for my home computer, but it doesn't do anything for the work computers. In a larger company, I can see the benefits of moving to it (and Server 2008), but we just don't have that large of a network (or budget).
As a side note: you wouldn't have wanted to see the mess here when I arrived; it was all running through a wireless Linksys router on an unencrypted network. I managed to get the office wired with gigabit Ethernet and turn off the wireless, got a good server in place, set up a Dell account, and moved out most of the oddball machines (we did two lease purchases, one for each dozen of the Vostros, and got a heck of a deal on them too). Of course I sold this to the boss by saying that it would all last at least 5 years, and we're about 2 years into that 5-year plan. :)
Maybe we'll just skip Windows 7 and move to 8 when it comes out.