The IGP Chronicles Part 1: Intel's G45 & Motherboard Roundup
by Anand Lal Shimpi & Gary Key on September 24, 2008 12:00 PM EST - Posted in Motherboards
Competitive Integrated Graphics?
Since the G45 GMCH is built on a 65nm process, Intel had a larger transistor budget to work with than on G35's GMCH - and thus increased the number of unified shader processors from 8 to 10.
| 1024 x 768 | Intel G45 | Intel G35 |
|---|---|---|
| Enemy Territory: Quake Wars | 6.8 fps | 5.6 fps |
| Company of Heroes | 24.5 fps | 16.5 fps |
| Race Driver GRID | 3.7 fps | 2.8 fps |
| Age of Conan | 7.9 fps | 6.1 fps |
| Crysis | 9.3 fps | 7.9 fps |
| Spore | 10.8 fps | 9.7 fps |
| Half Life 2 Episode Two | 40.7 fps | 27.9 fps |
| Oblivion (800 x 600) | 22.7 fps | 14.7 fps |
These shader processors are roughly comparable, one-for-one, to NVIDIA's, meaning that in terms of raw execution resources the G45 GMCH has 1/24th those of NVIDIA's GeForce GTX 280 (10 SPs versus 240). It gets even worse if you compare it to a more mainstream solution - take the recently announced GeForce 9500 GT, for example: it has only 32 SPs, putting the G45 at around 1/3 the clock-for-clock power of a 9500 GT.
Then there's the clock speed issue. While the GeForce 9500 GT runs its array of SPs at 1.4GHz, Intel runs its shader processors at 800MHz. Both Intel's and NVIDIA's architectures have a peak throughput of one shader instruction per SP per clock, so the 9500 GT not only has 3.2x the execution resources of the G45 GMCH, it also enjoys a 75% clock speed advantage - giving it roughly 5.6x (3.2 x 1.75) the raw throughput of the G45 GMCH.
But how about comparing Intel's graphics core to NVIDIA's IGP equivalent? The GeForce 8200 is NVIDIA's latest integrated graphics core; it has 8 SPs and runs them at 1.2GHz, giving NVIDIA a 20% advantage in raw throughput on paper.
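To put numbers on the paper comparison, here's a minimal sketch of the arithmetic. The SP counts and shader clocks for G45, the 8200 and the 9500 GT come from the text above; the GTX 280's ~1.3GHz shader clock is my addition and is an assumption, not a figure from this article:

```python
# Back-of-the-envelope peak shader throughput: scalar SPs x shader clock,
# since both Intel's and NVIDIA's architectures retire one shader
# instruction per SP per clock.
parts = {
    "Intel G45":       (10,  0.800),
    "GeForce 8200":    (8,   1.200),
    "GeForce 9500 GT": (32,  1.400),
    "GeForce GTX 280": (240, 1.296),  # assumption: ~1.3GHz shader clock
}

g45_peak = parts["Intel G45"][0] * parts["Intel G45"][1]  # 8.0 Ginstr/s

for name, (sps, clock_ghz) in parts.items():
    peak = sps * clock_ghz
    print(f"{name:16s} {sps:3d} SPs @ {clock_ghz:.3f} GHz "
          f"-> {peak:6.1f} Ginstr/s ({peak / g45_peak:4.1f}x G45)")
```

Run as-is, this reproduces the ratios in the text: the 8200 lands 1.2x ahead of G45 and the 9500 GT about 5.6x ahead.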
There are many unknowns here, however. NVIDIA has special execution units for transcendentals and it's unclear whether Intel has the same. There are also times when Intel must relegate vertex processing to the CPU, which can cause strange performance characteristics. But the point is that Intel's latest graphics core, at least on paper, is competitive with what NVIDIA is offering.
Neither the Intel nor the NVIDIA solution can hold a candle to AMD's 780G, which has a peak throughput of 40 shader operations per clock compared to 10 for Intel and 8 for NVIDIA. The reason AMD can do so much more is that each of its 8 processing clusters is 5-wide, just as on its desktop GPUs. If there's enough parallel data to work on, each of these clusters can retire five shader instructions per clock. Real-world utilization falls somewhere between one and five depending on the code being run and how efficiently AMD's real-time compiler packs it, but this generally translates into AMD dominating the IGP performance charts even at a lower clock speed than both the Intel and NVIDIA parts.
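As a rough illustration of why that 40 ops/clock peak is a best case, here's a hedged sketch that models each of the 780G's 8 clusters as retiring between one and five instructions per clock depending on how many slots the compiler fills. The ~500MHz core clock is my assumption; the text only says it's lower than Intel's and NVIDIA's clocks:

```python
# Effective shader throughput for AMD's 780G: 8 VLIW clusters, each 5-wide.
# Peak is 8 x 5 = 40 ops/clock; real utilization depends on how many
# independent instructions the compiler can pack into each 5-slot bundle.
CLUSTERS, WIDTH = 8, 5
CLOCK_GHZ = 0.5  # assumption: ~500MHz core clock, not stated in the text

def effective_gops(slots_filled: float) -> float:
    """Shader ops/sec given the average slots filled per cluster (1-5)."""
    assert 1.0 <= slots_filled <= WIDTH
    return CLUSTERS * slots_filled * CLOCK_GHZ

for slots in (1.0, 2.0, 3.5, 5.0):
    print(f"{slots:.1f} slots/cluster -> {effective_gops(slots):4.1f} Gops/s")
```

Even at just two slots filled per cluster, the 780G would match G45's 8 Gops/s peak under these assumptions, which is why it can top the IGP charts despite the clock deficit.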
Does this all really matter?
This next point is one that I've quietly argued for the past few years. ATI and NVIDIA have always acted holier-than-thou because of their IGP performance superiority over Intel, but I'd argue they're no better than the boys in blue.
Despite both ATI and NVIDIA being much faster than Intel, the overall gameplay experience delivered by their integrated graphics solutions is still piss poor - even in older games. Try running Oblivion, a 2.5-year-old title, on even AMD's 780G and you'll find that you have to run it at the lowest visual quality settings and the lowest resolutions (800 x 600, max) to get playable frame rates. At those settings, the game looks absolutely horrible.
In games that aren't visually demanding, performance doesn't actually matter and all three vendors end up doing just fine. Fundamentally, both ATI and NVIDIA want to sell more discrete cards, so they aren't going to enable overly high-performance integrated solutions. The IGP performance advantages in games amount to little more than a marketing advantage, since anyone who actually cares about gaming is going to be frustrated even by their higher-performing integrated solutions.
The area where ATI and NVIDIA deliver, and Intel historically hasn't, is the sheer ability to actually run games. In the past, driver issues meant that basic compatibility with many games was simply broken on Intel hardware. Intel tried to address much of that with G45.
There is one aspect of IGP performance that really matters these days, however: video decode acceleration.
53 Comments
Imperor - Sunday, September 28, 2008 - link
Impressive how many people just rant on about the review being inadequate when they obviously didn't even read the start of it! If they did, they'd know that reviews of AMD and nVidia boards are coming up and that all will be compared eventually! I get the feeling that the people talking about "Intel fanbois" tend to have the same kind of appreciation of another brand...
Stating the obvious isn't being partial. It just so happens that AMD doesn't even come close to competing with Intel in the CPU department! Sure, AMD might be cheaper, but there are cheap Intels out there as well. The whole platform tends to get a bit more expensive when you go with Intel, but you get what you pay for. I'm perfectly happy with my G35+E2140. Does everything a computer is supposed to do but gaming. I'm not a gamer, so that is a non-issue for me.
Very tempted to go mini-ITX with a 1.5TB HDD. Tiny box and lots of disk space!
Found a nice case for it as well, Morex Venus 668. Not that I know anything about it really but it'll hold up to 3 HDDs and a full size ODD and probably house decent cooling for the CPU while still being tiny (~8"x9"x13").
robg1701 - Saturday, September 27, 2008 - link
Do any of the boards support dual-link DVI? I'm getting a bit sick of having to include a video card in otherwise low-power boxes just to drive my 30" monitor :)
deruberhanyok - Friday, September 26, 2008 - link
[quote]We struggled with G45 for much of the early weeks of its release, but the platform wasn't problem-free enough for a launch-day review.[/quote]
You weren't serious here, were you? That basically says "The chipset had problems, so we didn't want to write a review talking about them."
piesquared - Friday, September 26, 2008 - link
Does this site have an ounce of integrity left? I seriously doubt it. Nothing but Intel pandering left here. You "reviewers" have the gall to do a review of this attempt at an IGP, yet fail to show any review of an AMD IGP because it would prove how inferior G45 is. Are you seriously implying that people are so stupid that they aren't capable of seeing through this BS? I remember something about an SB750 promise somewhere around 2 months ago that never materialized, then a 790GX promise that never materialized, then another 790GX roundup that not only never materialized, but the DFI preview article seems to have actually vanished, then the AMD IGP part II looks to be delayed or something, probably vanished due to Intel's poor performance.
I am really, really starting to wonder if AT was purchased by Intel. All evidence points to it. If not, then call a spade a spade and don't make promises you can't keep. I'm sure you think none of this matters because you're so popular that people will read no matter what you write here. I wouldn't be so confident if I were AT.
TA152H - Thursday, September 25, 2008 - link
I can tell you guys are really working on gaining that female readership. As everyone knows, women really go for that low-class, vulgar language.
Also, who would want to get rid of PS/2 ports? Whoever on your staff wants this had better have a reason beyond hating anything legacy. Where's the logic in adding two extra USB ports so you can remove the PS/2 ports? It's not like it's more flexible, really, because you pretty much always need the keyboard and mouse. When's the last time you were in the situation where you said, "Oh, I won't be needing my mouse and keyboard today, and I'm so strapped for USB ports, it's a good thing I can use the ones I normally use for the keyboard and mouse for something else"? Doubtful you've ever said it, and if you have, you have issues deeper than I am capable of dealing with.
It's not like the keyboard or mouse works better in a USB port, or that it's somehow superior in this configuration. In fact, the PS/2 ports were made specifically for this and are perfectly adequate for it. Didn't you guys know that USB has more overhead than the PS/2 ports? I guess not. So you worry about fractions of a percent going from motherboard to motherboard with the same chipset, but you prefer to use a USB mouse and keyboard? I just do not understand that. USB was a nice invention of Intel's to suck up CPU power so you'd need a faster processor. It's a pity this has been forgotten.
Sure, let's replace the efficient with the inefficient, so we can say we're done with the legacy ports and we can all feel like we've moved forward. Yes, that's real progress we want. Good grief.
CSMR - Friday, September 26, 2008 - link
Yeah, I had to get a quad core so I can dedicate one core to the USB mouse and one to the USB keyboard. Now I can type ultra fast and the mouse really zips around the screen.
MrFoo1 - Thursday, September 25, 2008 - link
Non-integrated graphics cards are discrete, not discreet.
discreet = modest/prudent/unnoticeable
discrete = constituting a separate entity
dev0lution - Thursday, September 25, 2008 - link
I really dislike the trend of recent reviews that go off on tangents about the state of the market, or particular vendor performance gripes, and then the rest of the review doesn't even touch on relevant benchmarks or features to back up these rants. If you're going to complain about IGP performance from AMD or NVIDIA, you might want to back that up with at least ONE of their boards included in the comparison charts. Who cares if Intel G45 gets bad frame rates against itself (across the board, to boot)? Why not show how the 3 IGP chipsets from the major vendors stack up against each other in something mainstream like Spore? If it's a G45-only review, how about saving the side comments for a true IGP roundup? Sorry, but if you have the time to post a "(p)review" that brings up competitive aspects with no benchmarks to balance out those comments, it's basically single-vendor propaganda - nothing in the conclusions deals with whether an IGP in the same price range from another vendor would fill the void that G45 clearly does not.
Since when do issues at the release date mean you can't post the review? "We struggled with G45 for much of the early weeks of its release, but the platform wasn't problem-free enough for a launch-day review." - Ummm, might want to include that as disclosure in all your other post-launch-day reviews!?! Or do other vendors get brownie points for being problem-free when you can actually buy the product?
Unfortunately, the inconsistency across multiple reviews makes it somewhat difficult to compare competing products from multiple vendors, because the methodology varies between single-chipset and competitive benchmarks, even when you can separate the irrelevant introductory comments and a particular author's bias from the rest of the review.
More authors obviously does not equal consistency or more relevant reviews...
yyrkoon - Thursday, September 25, 2008 - link
Looking forward to your review of this board (if I understood you correctly), as I have been keeping an eye on it for a while now. Perfect for an all-around general-use board (minus gaming, of course), but it would have been really REALLY nice if that 1x PCIe slot were a 16x PCIe slot with at least 8x bandwidth. Hell, I think I would settle for 4x PCIe speeds, just to have the ability to use an AMD/ATI 3650/3670 in this system. I think Jetway has a similar board with a 16x PCIe slot and slightly fewer features, at the cost of like $350 USD...
Now if someone reputable (meaning someone who can actually make a solid board from the START *cough*ABIT*cough*) would build one around a Core 2 mobile CPU, SO-DIMMs, etc., AT A REASONABLE PRICE... I think I might be in power consumption heaven. Running my desktop 'beast' tends to drain the battery banks dry ; )
iwodo - Wednesday, September 24, 2008 - link
I wonder if Anand could answer a few questions we have in mind.
Why, with a generational die shrink, do we only get 2 extra shaders instead of, say, 4 - 6? Where did all the extra available die space go?
With the new Radeon HD 4x series, people consistently report single-digit CPU usage when viewing 1080p H.264 with an E7xxx-series CPU, or slightly more than 15% when using an old Celeron. That is 2 - 3 times better than G45!!!! Even 780G is a lot better than G45. So why such a HUGE difference in the performance of so-called hardware-accelerated decoding?