Intel's Larrabee Architecture Disclosure: A Calculated First Move
by Anand Lal Shimpi & Derek Wilson on August 4, 2008 12:00 AM EST - Posted in GPUs
Introduction
Oooh this is dangerous.
It started with Intel quietly (but not too quietly) informing many in the industry of its plans to enter the graphics market with something called Larrabee.
NVIDIA responded by quietly (but not too quietly) criticizing the nonexistent Larrabee.
What we've seen for the past several months has been little more than jabs thrown back and forth, admittedly with NVIDIA being a little more public with its swings. Today is a big day: without discussing competing architectures, Intel is publicly unveiling, for the first time, the basis of its Larrabee GPU architecture.
Well, it is important to keep in mind that this is first and foremost NOT a GPU. It's a CPU. A many-core CPU that is optimized for data-parallel processing. What's the difference? Well, there is very little fixed-function hardware, and the hardware is targeted to run general-purpose code as easily as possible. The bottom line is that Intel can make this very wide many-core CPU look like a GPU by implementing software libraries to handle DirectX and OpenGL.
It's not so much emulating a GPU as directly implementing, on a data-parallel CPU, functionality that would normally be done on dedicated hardware. And developers will not be limited to just DirectX and OpenGL: this hardware can take pure software renderers and run them as if the hardware had been designed specifically for that code.
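To make that idea concrete, here is a rough sketch (ours, not Intel's, with entirely hypothetical names like shade_pixel) of what "graphics as software on a many-core CPU" looks like: a per-pixel shading loop written in ordinary C++ and spread across however many cores the machine exposes, rather than across fixed-function shader hardware.

```cpp
// Minimal sketch (illustrative only, not Larrabee's renderer): a software
// "pixel shader" loop that a many-core, data-parallel CPU could run directly.
// Each thread shades its own band of the framebuffer, so more cores mean
// more pixels shaded in parallel.
#include <cstdint>
#include <thread>
#include <vector>

constexpr int kWidth  = 1920;
constexpr int kHeight = 1080;

// Hypothetical per-pixel function standing in for the work a GPU's shader
// units would normally perform in dedicated hardware.
uint32_t shade_pixel(int x, int y) {
    uint8_t r = static_cast<uint8_t>(x * 255 / kWidth);
    uint8_t g = static_cast<uint8_t>(y * 255 / kHeight);
    return (0xFFu << 24) | (r << 16) | (g << 8) | 0x40u;  // ARGB
}

int main() {
    std::vector<uint32_t> framebuffer(kWidth * kHeight);
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 4;  // fall back if the core count is unknown

    std::vector<std::thread> workers;
    for (unsigned c = 0; c < cores; ++c) {
        workers.emplace_back([&, c] {
            // Interleave rows across threads: thread c takes rows c, c+cores, ...
            for (int y = static_cast<int>(c); y < kHeight; y += static_cast<int>(cores))
                for (int x = 0; x < kWidth; ++x)
                    framebuffer[y * kWidth + x] = shade_pixel(x, y);
        });
    }
    for (auto& t : workers) t.join();
    return 0;
}
```

Scale that same pattern up to many wide vector-capable x86 cores and you have the basic premise: the rendering pipeline becomes just another parallel program, which is why a pure software renderer can run on this hardware as readily as a DirectX or OpenGL workload.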
There is quite a bit here, so let's just jump right in.
101 Comments
phaxmohdem - Monday, August 4, 2008 - link
Can your mom play Crysis? *burn*
JonnyDough - Monday, August 4, 2008 - link
I suppose she could but I don't think she would want to. Why do you care anyway? Have some sort of weird fetish with moms playing video games or are you just looking for another woman to relate to? Ooooh, burn!
Griswold - Monday, August 4, 2008 - link
He is looking for the one playing his mom, I think.
bigboxes - Monday, August 4, 2008 - link
Yup. He worded it incorrectly. It should have read, "but can it play your mom?" :p
Tilmitt - Monday, August 4, 2008 - link
I'm really disappointed that Intel isn't building a regular GPU. I doubt that bolting a load of unoptimised x86 cores together is going to be able to perform anywhere near as well as a GPU built from the ground up to accelerate graphics, given equal die sizes.
JKflipflop98 - Monday, August 4, 2008 - link
WTF? Did you read the article?
Zoomer - Sunday, August 10, 2008 - link
He had a point. More programmable == more transistors. Can't escape from that fact. Given an equal number of transistors, running the same program, a more programmable solution will always be crushed by fixed-function processors.
JonnyDough - Monday, August 4, 2008 - link
I was wondering that too. This is obviously a push towards a smaller Centrino type package. Imagine a powerful CPU that can push graphics too. At some point this will save a lot of battery juice in a notebook computer, along with space. It may not be able to play games, but I'm pretty sure it will make for some great basic laptops someday that can run video. Not all college kids and overseas marines want to play video games. Some just want to watch clips of their family back home.
rudolphna - Monday, August 4, 2008 - link
As interesting and cool as this sounds, this is even more bad news for AMD, who was finally making up for lost ground. Granted, it's still probably 2 years away, and hopefully AMD will be back to its old self (Athlon64 era) by then. They are finally getting products that can actually compete. Another challenger, especially from its biggest rival, Intel, cannot be good for them.
bigboxes - Monday, August 4, 2008 - link
What are you talking about? It's been nothing but good news for AMD lately. Sure, let Intel sink a lot of $$ into graphics. Sounds like a win for AMD (in a roundabout way). It's like AMD investing into a graphics maker (ATI) instead of concentrating on what makes them great. Most of the Intel supporters were all over AMD for making that decision. Turn this around and watch Intel invest heavily into graphics and it's a grand slam. I guess it's all about perspective. :)