Anand's Thoughts on Intel Canceling Larrabee Prime
by Anand Lal Shimpi on December 6, 2009 8:00 PM EST
Posted in GPUs
Larrabee is Dead, Long Live Larrabee
Intel just announced that the first incarnation of Larrabee won't be a consumer graphics card. In other words, next year you're not going to be able to purchase a Larrabee GPU and run games on it.
You're also not going to be able to buy a Larrabee card and run your HPC workloads on it either.
Instead, the first version of Larrabee will exclusively be for developers interested in playing around with the chip. And honestly, though disappointing, it doesn't really matter.
The Larrabee Update at Fall IDF 2009
Intel hasn't said much about why it was canceled other than it was behind schedule. Intel recently announced that an overclocked Larrabee was able to deliver peak performance of 1 teraflop. Something AMD was able to do in 2008 with the Radeon HD 4870. (Update: the 1 TFLOP figure was measured SGEMM performance rather than a theoretical peak, so it's not exactly comparable; the point being that Larrabee is outgunned by today's GPU offerings.)
With the Radeon HD 5870 already at 2.7 TFLOPS peak, chances are that Larrabee wasn't going to be remotely competitive even if it came out today. We all knew this; no one was expecting Intel to compete at the high end. Intel has been quietly talking down the usefulness of >$200 GPUs for much of the past two years, indicating exactly where it views the market for Larrabee's first incarnation.
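To put those peak numbers in perspective, here's a quick back-of-the-envelope check of how the theoretical single-precision figures fall out of shader count and clock speed. This is my own illustration using AMD's published HD 4870 and HD 5870 specifications; the helper function is just for the sketch, not anything from Intel or AMD.

```python
# Peak single-precision throughput for AMD's VLIW GPUs of this era:
# stream processors x 2 FLOPs per clock (multiply-add) x clock speed.
def peak_gflops(stream_processors: int, clock_mhz: float, flops_per_clock: int = 2) -> float:
    return stream_processors * flops_per_clock * clock_mhz / 1000.0

print(peak_gflops(800, 750))    # Radeon HD 4870: 1200.0 GFLOPS (~1.2 TFLOPS)
print(peak_gflops(1600, 850))   # Radeon HD 5870: 2720.0 GFLOPS (~2.7 TFLOPS)
```

The same arithmetic is part of why a delayed Larrabee, demoed in the 1 - 2 TFLOPS range, looked dated next to AMD's late-2009 lineup.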
Thanks to AMD's aggressive rollout of the Radeon HD 5000 series, even at lower price points Larrabee wouldn't have been competitive - delayed or not.
I've got a general rule of thumb for Intel products: around 4 - 6 months before an Intel CPU officially ships, Intel's partners will have it in hand and running at near-final speeds. Larrabee hasn't been let out of Intel's hands, so chances are it's more than 6 months away at this point.
By then Intel wouldn't have been able to release Larrabee at any price point other than free. It'd be slower at games than sub-$100 GPUs from AMD and NVIDIA, and there's no way the first drivers wouldn't have some incompatibility issues. To make matters worse, Intel's 45nm process would stop looking so advanced by mid 2010. Thus the only option was to forgo making money on the first chips altogether rather than pull an NV30 or R600.
So where do we go from here? AMD and NVIDIA will continue to compete in the GPU space as they always have. If anything, this announcement supports NVIDIA's claim that making these things is, ahem, difficult, even if you're the world's leading x86 CPU maker.
Do I believe the 48-core research announcement had anything to do with Larrabee's cancelation? Not really. The project came out of a different team within Intel. Intel Labs have worked on bits and pieces of technologies that will ultimately be used inside Larrabee, but the GPU team is quite different. Either way, the canceled Larrabee was a 32-core part.
A publicly available Larrabee graphics card at 32nm isn't guaranteed, either. Intel says they'll talk about the first Larrabee GPU sometime in 2010, which means we're looking at 2011 at the earliest. Given the timeframe, I'd say a 32nm Larrabee is likely, but again, there are no guarantees.
It's not a huge financial loss for Intel. Intel still made tons of money all the while Larrabee's development was underway, and its 45nm fabs are old news and already paid off. Intel wasn't going to make a lot of money off of Larrabee had it sold the chip at retail - definitely not enough to recoup the R&D investment - and as I just mentioned, using Larrabee sales to pay off the fabs isn't necessary either. Financially it's not a problem, yet. If Larrabee never makes it to market, or fails to eventually be competitive, then it's a bigger problem. If heterogeneous multicore is the future of desktop and mobile CPUs, Larrabee needs to succeed, otherwise Intel's future will be in jeopardy. It's far too early to tell whether that's worth worrying about.
One reader asked how this will impact Haswell. I don't believe it will; from what I can tell, Haswell doesn't use Larrabee.
Intel has a different vision of the road to the CPU/GPU union. AMD's Fusion strategy combines CPU and GPU compute starting in 2011. Intel will have a single die with a CPU and GPU on it, but the GPU isn't expected to be used for much compute at that point. Intel's roadmap has the CPU and AVX units being used for the majority of vectorized floating point throughout 2011 and beyond.
Intel's vision for the future of x86 CPUs announced in 2005, surprisingly accurate
It's not until you get into the 2013 - 2015 range that Larrabee even comes into play. The Larrabee that makes it into those designs will look nothing like the retail chip that just got canceled.
Intel's announcement wasn't too surprising or devastating; it just makes things a bit less interesting.
75 Comments
Minion4Hire - Monday, December 7, 2009
MORE developers.... Sony tried to get game studios to develop for the Cell, but a lack of development meant that there were few reasons to buy a PS3 in the first year or two after its release (unless you wanted a Blu-ray player). Intel could have a more successful launch if they can familiarize as many developers as possible with Larrabee and receive their feedback. And I never said that holding back will "improve Larrabee at a faster rate"; I said that they'll have fewer problems with the product when they do launch it.
Seriously, you seem to be trying too hard. Intel has working hardware but they don't feel it's ready. I'm willing to bet their major hurdle right now is software more than hardware, but either way if it isn't ready then it isn't ready. Trying to break into a new market with a half-done product would be a stupider move than delaying it until it IS ready.
LaughingTarget - Monday, December 7, 2009
"Intel could have a more successful launch if they can familiarize Larrabee with and receive feedback from as many developers as possible."Here's the rub with this hypothesized plan. Familiarizing ones self with a product isn't some simple matter. This requires taking people away from productive products to simply learn another option. What business in its right mind would do this? This would amount to Intel getting free testers. Moving personnel over to new chipsets is an expensive endeavor and companies won't simply test Larrabee, espeically since the when, or even if, of its release on a mass scale. As it stands, "getting familiar" is either dumping cash into a huge hole or completely upside down when present values of the investment are considered.
This is why it took developers a long time to get ramped up on the PS3 and Cell. Sony had to foot the bill to get companies like Naughty Dog, Insomniac, and Konami to put out major games that actually took advantage of Cell. Cell has proven to be totally dead in the water everywhere else. Consumer computing, where the big dollars are, hasn't picked up on it and likely never will. As such, Sony had to personally show developers that money could be made by investing in the skills to properly use Cell, even though after the PS3 it's likely a dead technology.
This is the only way Intel can get Larrabee off the ground. The notion that software development companies are going to waste valuable resources beta testing an unknown quantity needs to be dropped. Intel has to fund and market a product that uses Larrabee and prove that it can make it, because otherwise all we have to go on is Intel's history, which is far from something Intel wants to advertise when it comes to graphics processing.
Zool - Monday, December 7, 2009
The thing is that with a full x86 core + graphics they will never EVER catch a pure GPU like the Radeon 5800. They will ALWAYS be behind on size and cost; it could only work if people didn't care about price (and that won't happen). It's like running graphics through an emulator: they will always need more resources than a pure GPU. Not to mention that performance in new games will be highly driver dependent; all games would need a specific driver approach to run at full speed.
The0ne - Monday, December 7, 2009
You know, there are companies out there that use IGPs to run their machines, and some of these companies are quite big. There isn't a huge need for graphics power, just a means to display something on the screen. Case in point: at a previous company I worked for years ago, we had a contract with Arrow/Intel to purchase their PCs for our mfg needs. We had hundreds monthly. Why didn't we want a better graphics card? Well, for one, we were running OS/2 Warp :) But most importantly, there wasn't a need at all... not in the banking/financial industry. Even now the servers I built to process all the financial transactions use plain old IGPs.
To make matters even worse, we were paying for hardware that exceeded the requirements of the system itself hahaha. What we really needed was more bandwidth, which is why fiber optic was designed in as the bus. But the CPU speed, the RAM, the graphics and especially the storage were all ridiculously unneeded. But you can't get "older" generations, so you pay for what's current. Sucks when you're in that position.
All I'm saying is that it's good Intel wants to better the chipset, but it's even better that they're taking their time to work things out. The market will still be there when it does come out; they're not going anywhere.
Zool - Monday, December 7, 2009
Larrabee selling as a 100% discrete graphics part is FAIL from day 1. Larrabee as a discrete graphics + HPC CPU could actually work.
So they just canceled the pure discrete graphics thing to save themselves from another Itanic.
rolodomo - Sunday, December 6, 2009
Intel may need to buy Nvidia to compete with AMD Fusion, especially several years out. Intel tried to do it themselves, and they couldn't pull it off. They probably have much more respect for Nvidia now. AMD/ATI v. Intel/Nvidia
dagamer34 - Monday, December 7, 2009
Have you been living under a rock? nVidia doesn't want to be purchased, and certainly not by Intel. And there are so many regulatory hurdles that would have to be cleared that it would never get past regulators. Realize that they'd own 80% of the GPU market plus a huge part of the chipset market. And once AMD bought ATI, they stopped making Intel chipsets.
JHBoricua - Monday, December 7, 2009
Nvidia has 80% of the GPU market? Only in their PR department. Otherwise, citations needed.
martinw - Monday, December 7, 2009
I think the previous poster meant that Intel+Nvidia together have around 80% of the GPU market.
Khato - Sunday, December 6, 2009
[QUOTE]Intel hasn't said much about why it was canceled other than it was behind schedule. Intel recently announced that an overclocked Larrabee was able to deliver peak performance of 1 teraflop. Something AMD was able to do in 2008 with the Radeon HD 4870.[/QUOTE]
The overclocked Larrabee demonstration was an actual performance of 1 TFLOPS on an SGEMM 4K x 4K calculation, compared to a theoretical peak of 2 TFLOPS, no? One article regarding the demonstration stated that the same calculation run on an NVIDIA Tesla C1060 (GT200) comes in at 370 GFLOPS, while the AMD FireStream 9270 (RV770) manages only 300 GFLOPS. So it's not something AMD was able to do in 2008, unless CrossFire is taken into consideration, at which point it might be possible. Anyway, the point is that the hardware isn't comparable to the GT200/RV770 generation.
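For readers wondering where such figures come from, here's a rough sketch of the standard accounting (my own illustration, not from the demo or the comment above): a dense N x N single-precision matrix multiply performs about 2N^3 floating-point operations, and the achieved rate is that work divided by the measured run time. The 0.137 s timing below is hypothetical, chosen only to be consistent with a 1 TFLOPS result.

```python
# Back-of-the-envelope FLOP accounting for an N x N SGEMM:
# roughly 2*N^3 operations (one multiply and one add per inner-loop step).
def sgemm_gflops(n: int, seconds: float) -> float:
    """Achieved throughput in GFLOPS for an N x N SGEMM that took `seconds` to run."""
    return (2.0 * n ** 3) / seconds / 1e9

n = 4096                                   # the 4K x 4K case from the demo
work_gflop = 2.0 * n ** 3 / 1e9            # ~137 GFLOP of work per multiply
print(f"Total work: ~{work_gflop:.0f} GFLOP")
print(f"Rate for an illustrative 0.137 s run: ~{sgemm_gflops(n, 0.137):.0f} GFLOPS")  # ~1 TFLOPS
```

Measured SGEMM throughput always sits below a card's theoretical peak, which is why comparing Larrabee's 1 TFLOPS SGEMM result against another GPU's theoretical number isn't apples to apples.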