AMD in Consumer Electronics
The potential of Fusion extends far beyond the PC space and into the embedded space. If you can imagine a very low power, low profile Fusion CPU, you can easily see it being used in not only PCs but consumer electronics devices as well. The benefit is that your CE devices could run the same applications as your PC devices, truly encouraging and enabling convergence and cohabitation between CE and PC devices.
Despite both sides attempting to point out how they differ, AMD and Intel actually have very similar views on where the microprocessor industry is headed. Both companies have told us that they have no desire to engage in the "core wars" - that is, we won't see a race simply to keep adding cores. The explanation is the same one that applied to the GHz race: if you scale exclusively in one direction (clock speed or core count), you eventually run into the same power wall. The true path to performance is a combination of increasing instruction level parallelism, clock speed and core count in line with the demands of the software you're trying to run.
AMD has been a bit more forthcoming than Intel in this respect, indicating that it doesn't believe there's a clear sweet spot, at least for desktop CPUs. AMD doesn't believe there's enough data to conclude whether 3, 4, 6 or 8 cores is the ideal number for desktop processors. From our testing with Intel's V8 platform, an 8-core system targeted at the high end desktop, it is extremely difficult to find high end desktop applications that benefit from 8 cores over 4. Our instinct is that for mainstream desktops, 3 - 4 general purpose x86 cores is the near term target that makes sense. You could potentially lower the number of general purpose cores needed by pairing them with specialized hardware (e.g. an H.264 encode/decode core).
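As a rough illustration of why those extra cores are so hard to exploit, Amdahl's law is a useful back-of-the-envelope model: once the serial portion of a workload is factored in, doubling the core count buys far less than double the performance. The sketch below (Python, with purely hypothetical parallel fractions - these are not measured figures from any application we tested) shows how quickly the gain from going 4 to 8 cores shrinks.

```python
# Illustrative only: Amdahl's law as a rough model for why desktop apps
# rarely gain much going from 4 to 8 cores. The parallel fractions below
# are hypothetical, not measurements from this article.

def amdahl_speedup(parallel_fraction, cores):
    """Ideal speedup when only part of the work can run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for p in (0.50, 0.75, 0.90):
    s4 = amdahl_speedup(p, 4)
    s8 = amdahl_speedup(p, 8)
    print(f"{p:.0%} parallel: 4 cores -> {s4:.2f}x, 8 cores -> {s8:.2f}x "
          f"(extra gain {s8 / s4:.2f}x)")
```

Even a workload that is 90% parallel - already optimistic for most desktop software of this era - would only gain about 1.5x from the second set of four cores under this model.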
What's particularly interesting is that many of Intel's goals for the future of its x86 processors are in line with what AMD has planned. For the past couple of IDFs Intel has been talking about bringing to market a < 0.5W x86 core for devices that sit somewhere in size and complexity between a cell phone and a UMPC (e.g. the iPhone). Intel has committed to delivering such a core, called Silverthorne, in 2008, based on a new micro-architecture designed for these ultra low power environments.
AMD confirmed that it too envisions ultra low power x86 cores for use in consumer electronics devices, areas where ARM or other specialized cores are commonly used today. AMD also recognizes that it can't address this market by simply reducing the clock speed of its current processors, and thus it mentioned that it is working on a separate micro-architecture for these ultra low power markets. AMD didn't attach a timeframe or roadmap to these plans, but knowing what we know about Fusion's debut, we'd expect a lower power version targeted at UMPC and CE markets to follow.
Why even think about bringing x86 cores to CE devices like digital TVs or smartphones? AMD offered one clear motivation: the software stack running on these devices is going to get more complex. Applications on TVs, cell phones and other CE devices will grow complex enough to require faster processors. Combine that with the fact that software developers don't want to target multiple processor architectures when they deliver software for these devices, and using x86 as the common platform between CE and PC software creates an environment where the same applications and content can be available on any device. The goal of PC/CE convergence is to give users access to any content, on any device, anywhere - and if all of the devices you're trying to access content and programs on happen to be x86, the process gets much easier.
Why is a new core necessary? Although x86 can be applied to virtually any market segment, a given core design is really only useful across about an order of magnitude of power. For example, AMD's current desktop cores can easily be scaled up or down to hit TDPs in the 10W - 100W range, but they would not be good for hitting something in the sub-1W range. AMD can address the sub-1W market, but it will require a different core from the one it uses to address the rest of the market. This philosophy is akin to what Intel discovered with Centrino: in order to succeed in the mobile market, you need a mobile specific design. To succeed in the ultra mobile and handtop markets, you need an ultra mobile/handtop specific processor design as well. Both AMD and Intel realize this, and now both companies have publicly stated that they are doing something about it.
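To see why you can't simply downclock a desktop core into that range, a back-of-the-envelope dynamic power model helps. The sketch below uses the textbook approximation that switching power scales with capacitance, the square of supply voltage, and frequency; the starting TDP and scaling factors are hypothetical numbers chosen purely for illustration, not AMD figures.

```python
# Rough sketch of why voltage/frequency scaling alone can't turn a ~65W
# desktop core into a sub-1W part. Uses the classic dynamic power
# approximation P ~ C * V^2 * f; every number here is hypothetical.

def scaled_power(base_power_w, v_scale, f_scale):
    """Dynamic power after scaling supply voltage and clock frequency."""
    return base_power_w * (v_scale ** 2) * f_scale

base_tdp = 65.0  # hypothetical desktop-core TDP in watts

# Aggressive but plausible knobs: supply voltage cut by 40%, clock cut to a quarter.
print(f"{scaled_power(base_tdp, v_scale=0.6, f_scale=0.25):.1f} W")  # ~5.9 W - nowhere near 1 W
```

Getting from there to well under 1W would take roughly another 6x reduction, and leakage plus the minimum usable supply voltage put a floor under how far voltage and frequency can realistically drop - which is why both companies talk about a separate, simpler core for this space rather than a downclocked desktop part.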
55 Comments
osalcido - Sunday, May 13, 2007 - link
Like they price gouged when X2s were introduced? If so, I hope Intel kills them off. If I'm gonna get gouged I'd rather it be by a monopoly, so maybe the government will do something about it.
Spoelie - Sunday, May 13, 2007 - link
there was really no price gouging as far as I know. AMD was capacity constrained, they were selling every possible cpu they could make at those prices, and even had backorders. I remember that around the summer of 2005, they had sold out their capacity for at least the next 6 months. It would be absolutely economically INSANE to lower prices under those conditions. If you sell every single cpu you can make, you're not gonna lower prices to increase demand...
But well yeah, around feb 2006 came the news of core 2 ;)
Kougar - Sunday, May 13, 2007 - link
I'm not sure, but whatever image protection you are using to prevent direct image linking seems to now be breaking ALL images in Opera. I have had problems in the past with Anandtech review/article images, but chalked it up to a browser setting I could not pin down. So far it still only happens with Anandtech images, and after a full reinstall of Opera 9.20 I still can't see any images or any image placeholders in this article. I did not even know there were any images until I got to page 7, where the captions were left hanging in midpage. I really hate having to switch to IE7 to read articles, so if this can be easily fixed I'd very much appreciate it. Thanks.
If it helps any, if I am looking for it I can sometimes spot an image start to load, before it is near instantly removed from the page and the text reshuffled to fill the empty void.
bigbrent88 - Sunday, May 13, 2007 - link
I know this may be a simple way to look at AMD's Fusion and future chips based on that idea, but isn't this close to what the Cell already is? Imagine you could remake the Cell with a current C2D (using the current power leader) and include more, better SPEs with something like HT in AM2, and all of this on a smaller die than you could do now. Would that not be the basic first step they are going to take? Many have said the Cell is ahead of its time, and I also agree that some design elements are inhibiting its overall power, but the success of Folding shows what the Cell's processing can do in these types of environments, and that's what AMD is looking at in the near term. I just can't wait to drop my X2 3800 and get a good upgrade to go along with that new DX10 card sometime in the next year. Bring it AMD!
noxipoo - Friday, May 11, 2007 - link
get everything in focus, for Christ's sake.
plonk420 - Friday, May 11, 2007 - link
it's interesting how many commercial programs aren't multithreaded. take a look at this year's Breakpoint demos/intros, and just about ALL the top 3 or 5 (or more) take advantage of 2 cores (i don't have more than 2 to know if they would make use of the extra ones or not). check out the Breakpoint 2007 entries at pouet.net and fire something up with a Task Manager open on a second monitor and see for yourself ;)
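For anyone who would rather run that Task Manager check programmatically, a minimal sketch might look like the following; it assumes the third-party psutil package (not something mentioned in the comment) and simply samples per-core load while the program of interest is running.

```python
# A programmatic version of the "watch Task Manager" test: sample per-core
# load while the program of interest is running. Assumes the third-party
# psutil package is installed (pip install psutil).
import psutil

for _ in range(10):  # sample for roughly 10 seconds
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    busy = sum(1 for load in per_core if load > 50.0)
    print(f"per-core load: {per_core}  ->  {busy} core(s) above 50%")
```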
OcHungry - Friday, May 11, 2007 - link
From the tone of your article I have no doubt AMD is about to put Intel where it belongs, in the so-so technology arena with lots of marketing maneuvering to sell inferior products. I like the jetliner graph where the Airbus is taking off at a steep angle and the other small jet is going horizontal with a little inclination. That says it all, and it's how the 2 (Intel and AMD) are perceived in the technology world. It's like this: Intel refines the same old, but AMD is into innovation and new things. Good for AMD, it's about time. The heterogeneous architecture, Fusion, and Torrenza are where computing technology should be heading, and AMD is taking the lead, as always. I live in Austin, TX, and have a few friends working @ AMD who tell me: buy AMD shares as much as you can, because good things are about to explode and neither Intel nor Nvidia can catch up to it, ever.
sandpa - Friday, May 11, 2007 - link
actually they are asking everybody to buy AMD shares so that they can sell off their worthless AMD stock for a better price :) don't listen to them ... they are not your friends. No one will be able to catch up with AMD "ever"??? yeah, keep dreaming fanboi!
OcHungry - Friday, May 11, 2007 - link
Yeah right. Tell that to Fidelity, who bought more AMD shares lately (13% total). And I guess the rise in price yesterday and today was meaningless?
Intel marketing thugs are at work, no change there.
http://www.theinquirer.net/default.aspx?article=39...
yyrkoon - Friday, May 11, 2007 - link
This is exactly what I was thinking while reading, then I ran into the above paragraph, and my suspicions were 'reinforced'. However, if this is the case, I cannot help but wonder what will happen to nVidia. Will nVidia end up like 3dfx? I guess only time will tell. There is a potential problem I am seeing here however: if we do finally get integrated graphics on the CPU die, what next? Audio? After a while this could be a problem for the consumer base, and may resemble something along the lines of how a lot of Linux users view Microsoft, with their 'Monopoly'. In the end, 'we' lose flexibility, and possibly the freedom to choose what software will actually run on our hardware. This is not to say I buy into this belief 100%, but it is a distinct possibility.
Apparently Intel suspects something is going on as well. One look at the current prices of the E6600 C2D should confirm this, as it's currently half the price of what it was a month ago. Unless there is something else I am missing, but the Extreme CPUs still seem to be hovering around ~$1000 USD.
I am very pleased to hear that AMD is continuing support for Socket AM2. It was my previous belief that they were going to phase this socket out for a newer socket, and if that was the case a few months ago, I am glad that they listen and learn. Releasing products that underperform the competition is one thing, but alienating your user base is another . . . That being said, I really do hope that Barcelona/K10/whatever the hell the official name is will give Intel some very tight competition (at least).
I can completely understand why AMD is being tight lipped, I have suspected the reasons why for some time now, and personally, I believe it to be in their best interest to remain doing so. And yes, it may reflect badly on AMD at this point in time, but what would you prefer? Intel learning your secrets, and thus rendering them moot, or a few 'whiners', such as ourselves, not knowing what is going on? They are doing the right thing by them, and that is all that matters. No one, including Intel 'fanboys', wants AMD to go under; they may think so, until it really does happen, and then they are locked into whatever Intel deems necessary, which is bad for everyone.
Now, if AMD could come up with something similar to vPro/AMT, or perhaps AMD/Intel could make a remote administration (BIOS, or similar level) 'standard', I think I would be happy, for at least a little while . . .