Anand's Thoughts on Intel Canceling Larrabee Prime
by Anand Lal Shimpi on December 6, 2009 8:00 PM EST - Posted in GPUs
Larrabee is Dead, Long Live Larrabee
Intel just announced that the first incarnation of Larrabee won't be a consumer graphics card. In other words, next year you're not going to be able to purchase a Larrabee GPU and run games on it.
You're also not going to be able to buy a Larrabee card and run your HPC workloads on it either.
Instead, the first version of Larrabee will exclusively be for developers interested in playing around with the chip. And honestly, though disappointing, it doesn't really matter.
The Larrabee Update at Fall IDF 2009
Intel hasn't said much about why Larrabee was canceled, other than that it was behind schedule. Intel recently announced that an overclocked Larrabee was able to deliver peak performance of 1 teraflop, something AMD managed back in 2008 with the Radeon HD 4870. (Update: the 4870 actually peaks at 1.2 TFLOPS, so it's not an exact comparison; the point is that Larrabee is outgunned by today's GPU offerings.)
With the Radeon HD 5870 already at 2.7 TFLOPS peak, chances are that Larrabee wasn't going to be remotely competitive, even if it came out today. We all knew this; no one was expecting Intel to compete at the high end. Its representatives have been quietly talking down the usefulness of >$200 GPUs for much of the past two years, indicating exactly where Intel views the market for Larrabee's first incarnation.
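The peak numbers are easy to sanity check, since peak throughput is just lanes x flops-per-lane x clock. Here's a quick sketch (the Larrabee clock is my assumption, picked to match the demoed figure; the core count and 16-wide vector units are as reported, and the Radeon numbers follow AMD's published specs):

```cpp
#include <cstdio>

// Peak single-precision GFLOPS = lanes * flops-per-lane-per-clock * GHz.
static double peak_gflops(int lanes, int flops_per_clock, double clock_ghz) {
    return lanes * flops_per_clock * clock_ghz;
}

int main() {
    // Larrabee: 32 cores x 16-wide SIMD, multiply-add = 2 flops/clock, ~1 GHz (assumed)
    std::printf("Larrabee: ~%.0f GFLOPS\n", peak_gflops(32 * 16, 2, 1.0));
    // Radeon HD 4870: 800 stream processors, MAD = 2 flops/clock, 750 MHz
    std::printf("HD 4870:  ~%.0f GFLOPS\n", peak_gflops(800, 2, 0.75));
    // Radeon HD 5870: 1600 stream processors, MAD = 2 flops/clock, 850 MHz
    std::printf("HD 5870:  ~%.0f GFLOPS\n", peak_gflops(1600, 2, 0.85));
    return 0;
}
```

That works out to roughly 1024, 1200 and 2720 GFLOPS respectively: an overclocked Larrabee sitting below AMD's 2008 part, and well under half of a 5870.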
Thanks to AMD's aggressive rollout of the Radeon HD 5000 series, even at lower price points Larrabee wouldn't have been competitive - delayed or not.
I've got a general rule of thumb for Intel products: around 4 - 6 months before an Intel CPU officially ships, Intel's partners will have it in hand and running at near-final speeds. Larrabee hasn't been let out of Intel's hands, so chances are that it's more than 6 months away at this point.
By then Intel wouldn't have been able to release Larrabee at any price point other than free. It'd be slower at games than sub-$100 GPUs from AMD and NVIDIA, and there's no way that the first drivers wouldn't have some incompatibility issues. To make matters worse, Intel's 45nm process would no longer look so advanced by mid-2010. Thus the only option was to forgo making a profit on the first chips altogether rather than pull an NV30 or R600.
So where do we go from here? AMD and NVIDIA will continue to compete in the GPU space as they always have. If anything this announcement supports NVIDIA's claim that making these things is, ahem, difficult; even if you're the world's leading x86 CPU maker.
Do I believe the 48-core research announcement had anything to do with Larrabee's cancellation? Not really. The project came out of a different team within Intel. Intel Labs has worked on bits and pieces of technology that will ultimately be used inside Larrabee, but the GPU team is quite separate. Either way, the canceled Larrabee was a 32-core part.
A publicly available Larrabee graphics card at 32nm isn't guaranteed, either. Intel says it'll talk about the first Larrabee GPU sometime in 2010, which means we're looking at 2011 at the earliest. Given the timeframe I'd say that a 32nm Larrabee is likely, but again, there are no guarantees.
It's not a huge financial loss to Intel. Intel still made tons of money while Larrabee's development was underway, and its 45nm fabs are old news and paid off. Intel wasn't going to make a lot of money off of Larrabee had it sold them on the market, definitely not enough to recoup the R&D investment, and as I just mentioned, using Larrabee sales to pay off the fabs isn't necessary either. Financially it's not a problem, yet. If Larrabee never makes it to market, or fails to eventually be competitive, then it's a bigger problem. If heterogeneous multicore is the future of desktop and mobile CPUs, Larrabee needs to succeed; otherwise Intel's future will be in jeopardy. It's far too early to tell whether that's worth worrying about.
One reader asked how this will impact Haswell. I don't believe it will; from what I can tell, Haswell doesn't use Larrabee.
Intel has a different vision of the road to the CPU/GPU union. AMD's Fusion strategy combines CPU and GPU compute starting in 2011. Intel will have a single die with a CPU and GPU on it, but the GPU isn't expected to be used for much compute at that point. Intel's roadmap has the CPU and AVX units being used for the majority of vectorized floating point throughout 2011 and beyond.
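To make that concrete: AVX gives each core 256-bit vector registers, so one instruction operates on eight single-precision floats. Here's a minimal sketch of the kind of vectorized FP loop Intel expects to keep on the CPU (a hypothetical SAXPY kernel using standard AVX intrinsics; the function and array names are mine):

```cpp
#include <immintrin.h>  // AVX intrinsics; needs an AVX-capable compiler/CPU

// Hypothetical y = a*x + y kernel: the streaming FP work Intel's roadmap
// keeps on the CPU's AVX units rather than handing to a GPU.
void saxpy_avx(float a, const float* x, float* y, int n) {
    __m256 va = _mm256_set1_ps(a);                      // broadcast a to 8 lanes
    int i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 vx = _mm256_loadu_ps(x + i);             // load 8 floats of x
        __m256 vy = _mm256_loadu_ps(y + i);             // load 8 floats of y
        vy = _mm256_add_ps(_mm256_mul_ps(va, vx), vy);  // 8 multiply-adds at once
        _mm256_storeu_ps(y + i, vy);                    // store 8 results
    }
    for (; i < n; ++i)                                  // scalar tail
        y[i] = a * x[i] + y[i];
}
```

A GPU would spread the same loop across thousands of threads; the roadmap question is simply which side of the fence that work lands on.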
[Image: Intel's vision for the future of x86 CPUs, announced in 2005 - surprisingly accurate]
It's not until you get into the 2013 - 2015 range that Larrabee even comes into play, and the Larrabee that makes it into those designs will look nothing like the retail chip that just got canceled.
Intel's announcement wasn't too surprising or devastating; it just makes things a bit less interesting.
Comments
Christobevii3 - Monday, December 7, 2009
I was hoping they'd release this as an add-in card for servers. Since it uses general-purpose processors, having an add-in card with 32 or so mediocre CPUs and being able to allocate them to virtual servers would be amazing.

pugster - Monday, December 7, 2009
I figured that Intel would cancel Larrabee because of cost. An Intel Core 2 Duo Wolfdale processor with 6MB of L2 cache has 410 million transistors and easily sells for $200. My cheap GeForce 9600 GSO sells for maybe $70, and the GPU itself has 505 million transistors. I figure that even if Intel made competitive GPUs, they would barely eke out a profit selling these Larrabee GPUs.
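For reference, a quick sketch of the margin math behind that comment, using its approximate prices and transistor counts (these are the commenter's figures, not audited numbers):

```cpp
#include <cstdio>

// Transistors-per-dollar from the comment's rough figures.
int main() {
    double cpu = 410e6 / 200.0;  // Core 2 Duo Wolfdale: 410M transistors, ~$200
    double gpu = 505e6 / 70.0;   // GeForce 9600 GSO:    505M transistors, ~$70
    std::printf("CPU: ~%.2fM transistors per dollar\n", cpu / 1e6);  // ~2.05M
    std::printf("GPU: ~%.2fM transistors per dollar\n", gpu / 1e6);  // ~7.21M
    return 0;
}
```

By these numbers a GPU vendor ships roughly 3.5x the transistors per dollar, which is the margin squeeze the comment is pointing at.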
Robear - Monday, December 7, 2009

At this point I wouldn't be all that surprised to hear an announcement of a new discrete graphics microarchitecture. Intel has an advantage with nVidia and AMD both being fabless, and both using TSMC; a whole article could be written about that alone.

Larrabee was heckled in nVidia/AMD circles over the notion of a software-driven, general-purpose GPU. I wanted to believe that Intel had an ace up its sleeve, but nVidia and AMD have been doing this for a while. You couldn't just dismiss what they were saying.
The whole "fusion" architecture seems to be the future of the CPU / GPU, so it would surprise me if Intel dropped out of the discrete business completely. It makes sense to invest in graphics, and it makes sense to offset your R&D costs by selling your product in another market (i.e. mobile graphics -> discrete).
If I were to guess, I would assume that after performance profiling, their engineers kept arriving at "if we had purpose-specific hardware we could..." You can use your imagination there.
Anyway, I would expect a new microarchitecture to be announced in the next 6 months or so. That might explain why they didn't kill Larrabee completely. We'll probably see a GP, many-core architecture with purpose-specific instructions (like SSE). x86 would be good, and I can see where Intel would benefit from that, but I wouldn't put it past Intel to go with a new instruction set specific to GPUs. I could see that instruction set quickly becoming a standard, too.
x86 instructions are good, but that instruction set fares best with branch-heavy multitasking threads, integer computing, and general I/O. GPUs carry a very heavy FP workload that's usually streaming, which on the CPU side has been handled with x86 extensions. I can see x86 support being a burden on the overall architecture, so it seems only natural to take the lessons learned and start over.
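As a toy illustration of those two workload shapes (hypothetical code, not anything from Intel): the first routine is the branchy, integer, pointer-chasing work x86 cores are built for; the second is the independent, streaming FP work a GPU is built around.

```cpp
#include <cstdint>

// Branch-heavy integer work with dependent loads and unpredictable branches:
// the shape that favors a big out-of-order x86 core.
struct Node { std::uint64_t key; Node* left; Node* right; };

bool contains(const Node* n, std::uint64_t key) {
    while (n) {
        if (key == n->key) return true;
        n = (key < n->key) ? n->left : n->right;
    }
    return false;
}

// Streaming FP work: every element is independent, so it maps cleanly onto
// thousands of GPU lanes or threads (or onto wide SIMD units).
void shade(float* out, const float* a, const float* b, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = a[i] * b[i] + 0.5f;  // pure streaming multiply-add
}
```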
Intel knows instruction sets. Intel knows compilers. Intel knows how to fab. My disappointment in not seeing an Intel GPU this year is mitigated by my excitement to see what Intel will be coming out with in its place.
Scali - Tuesday, December 8, 2009
Larrabee already HAS purpose-specific instructions (LRBni). It's not that different from what nVidia does with Fermi, really; it's just that nVidia doesn't use the x86 ISA as its basis. Doesn't matter all that much when you use C++ anyway.
Shadowmaster625 - Monday, December 7, 2009
Did anybody catch the innuendo in their choice of game? ... Enemy Territory! ~har har har

UltraWide - Monday, December 7, 2009
What about the angle that Intel is still being investigated for anti-competitive business practices? No one has talked about this tidbit.

The0ne - Monday, December 7, 2009
I don't know about you, but I'm not surprised, nor will I be if they get investigated. But then, neither am I surprised about other companies trying to win market share. Dell does it, AMD does it, MS does it, Apple is the king of it all, etc. It's not really related to this topic, though; it's a huge subject by itself.
Targon - Monday, December 7, 2009
Intel has a problem, and that is that while they can do a VERY good job when focused on making ONE very good product, they don't do a very good job in multiple areas.

Look at processors... Intel has been focused on making the best processors out there, but really, they don't make ANYTHING else that is really amazing. Chipset quality is good, because that SUPPORTS their CPU business. GPU designs are pathetic and so far behind AMD and NVIDIA that I won't recommend ANY system with an Intel graphics product. Their networking products are also less than amazing.
So, does Intel do ANYTHING really well other than processors and chipsets? Even when they try to branch out into other areas and invest a LOT of money, the results have yet to impress anyone.
Even when they have tried to come up with an all-new design, it has flopped EVERY time. The last successful all-new design was the Pentium Pro (even though the original flopped, the design ended up in the Pentium 2 and eventually in the Core and Core 2 lines of processors)... so, Intel has come up with THREE really original designs: the old 8086/8088, the Pentium, and the Pentium Pro. Itanium flopped, and what else will Intel do?
Scali - Tuesday, December 8, 2009
Uhhh... the 286, 386 and 486 were all completely unique architectures as well, and all were very successful.

As for Intel's IGPs... Sure, Intel goes for a design that is as cheap and simple as possible, but in that light they do well. Intel's IGPs are VERY cheap, and they support everything you might need... they have DX10 support, video acceleration, DVI/HDMI/HDCP and all that.
Given the amount of transistors that Intel spends on their IGPs and the number of processing units inside their IGPs, they are in line with what nVidia and AMD offer. They just apply the technology on a smaller scale (which is good for cost and for power consumption/heat, which matter more to a lot of people than overall performance).
somedude1234 - Monday, December 7, 2009
OK, so Intel GPUs can't touch ATI/AMD's or nVidia's best; we know this, we've always known this. But who has more GPU market share?

You want a list of other things that Intel does well?
- Network controllers: some of the best money can buy.
- Consumer-level SSDs: Theirs are the best on the market in the benchmarks that matter the most for performance (random R/W). AFAIK, even the X25-E doesn't have a super-cap or other cache protection, so I don't consider that to be a truly "enterprise" drive.
- NAND Flash memory
Just to name a few examples.