Intel always intentionally gimped their Atoms. Thank God they failed miserably, or we would have seen only a 10-15% nominal increase in smartphone performance because Intel wanted to milk everyone. Guess what, it happened for the good. Intel chips were never as good as competing Snapdragons and Apple AX-series processors, and having used smartphones with Intel SoCs, it struck me that they performed poorly compared to the competition. Heck, even MediaTek makes better SoCs.
Because they weren't selling the chips at a profit. How much money can they make supplying a chip for a $300 phone? And the 'contra revenue' bit means they were paying someone to use the chips. It was a gamble.
I think it's a shocker and a shame. I really hoped they could make some deal with Google so that x86 phones would become standard and Google could actually update their phones' OSes. Now Android tablets probably have no hope, and Nexus continues to be the only real choice.
Simple. Without a Linux kernel, you can't make a ROM that works with modern Android, and you need binary blobs to update those ARM kernels. Snapdragon and MediaTek don't maintain the blobs that are basically needed for newer Android versions, so OEMs buy the new stuff...
On the other hand, the x86 platform doesn't need those binary blobs; it just needs vanilla Linux, and that's it.
Can someone explain how come Intel didn't manage to overcome ARM and win the mobile SoC industry? It sounds weird given their insane amount of experience with processors and the manpower and resources available to them.
I can't seem to edit... anyway, even doing everything in their power, they can't make an SoC that is at least as good as or better than the current top SoCs, like the SD820 and Apple's SoCs?
They never really put much priority on it. Server, desktop, and laptop CPUs were making them several billion in profit per quarter; they just never prioritized making an industry-leading mobile chip. If they had prioritized it, they would at the very least have been doing the latest manufacturing node on mobile first, or at the same time... Instead, 1-2 years after the desktop/laptop/server CPUs came out on the 32nm, 22nm, and 14nm processes, came the comparable Atom. Kind of like Microsoft and Windows Mobile all those years. Half-assing it all the way.
How does it make sense for Intel not to invest in and prioritize the mobile SoC industry if there are so many smartphones today? The Android ecosystem alone has over 1 billion devices. I don't get it.
1 billion devices making Intel a net profit of $3-5 each is terrible when they could invest their design resources into the datacenter side of things and make billions per quarter on far fewer chips. Smartphone SoCs are simply not very lucrative because of how low-cost they are, regardless of how many they sell.
-- There were rumblings of ARM entering the datacenter at some point.
The problem for ARM and its allies is that while one might argue that the real silicon engine in an Intel iWhatever CPU is just a dirt simple RISC machine like ARM's ISA, at the level of the assembler semantics (what compilers target) it's still a CISC machine. IOW, a gaggle of ARM CPUs might look like a drop-in replacement for an iWhatever, but only for embarrassingly parallel problems, of which there aren't many.
Your claim ("the real silicon engine in an Intel iWhatever cpu is just a dirt simple RISC machine like ARM's ISA") is misleading. x86 as a compiler/OS target consists of much more than just the instruction set; it also includes things like the OS model (SMM, PAE, interrupt model, hypervisor stuff, etc) and the memory ordering model. It is THESE things, not cracking complex instructions into simpler instructions, that represents the true x86 complexity.
-- It is THESE things, ... that represents the true x86 complexity.
So, what's your point?
1 - A gaggle of ARM CPUs can do what the x86 does vis-a-vis the datacenter since it doesn't have all that cruft, so ARM will win.
2 - The datacenter is mostly embarrassingly parallel, so ARM will win.
3 - All the extra cruft stuffed into iWhatever silicon is needed for the datacenter, so Intel will win.
No dude, the x86 complexity is ENTIRELY in the front end, in the decoder. (SMM, PAE, interrupt model, hypervisor stuff, etc) are all things you need to consider in ANY arch, not just x86, or even ARM.
Why not give the 15% the option to take a pay cut till business picks up? Otherwise you may still end up supporting them in the form of food stamps, Medicaid, and welfare.
One issue is that take-home pay is maybe half (?) the cost of an employee to a company like Intel (consider healthcare, 401K/pension, basic overhead, etc.). A 20% cut in take-home pay is not a 20% cut in the cost to Intel of keeping someone employed. Some of these employees will get "voluntary severance"/early retirement.
If you could convince everyone in the company to take a 15% pay cut, that might be an option. But then you start to lose your high performers to other companies, etc.
As someone who works on the mobile side of the SoC market in manufacturing, I can tell you that the margins are slim. Unless you have a noticeable market-share advantage, profits can be pretty sensitive to minor market disturbances. Intel's business model requires high profit margins due to their research-heavy focus. Why spend $10B on low-power SoC R&D to eke out $1B profit over investment costs (assuming you can compete against current ARM producers) when the same $10B into data center, 5G, etc. can net them a guaranteed $5-6B in profit over investment?
Bingo, ARM is basically a very scalable and customizable architecture. Anyone can license it and customize it any way they want and that allows it to be extremely adaptable and very low cost...
While Intel's x86 is one-size-fits-all... making it not very customizable at all, and while not expensive, still not as cheap as ARM can be... Thus the need to subsidize in order to compete, and that's not even counting how hard it is to gain market share in a market already dominated by ARM...
Add to that: why would they? It's not like there was an urgent need to run software that requires an x86 processor, and initially a popular mobile OS like Android had app compatibility issues on x86 devices, which left a stigma even though it's 99% not an issue anymore.
There are Android builds for x86. Any dual-boot Windows/Android tablet runs Android-x86. Even without optimizations it has superior SunSpider performance to most of the ARM competition.
Long story short: Intel already has a competitive mobile CPU for everything BUT phones. Braswell is totally competitive in 8"+ tablets where you can fit a 5000+ mAh battery. They just ran the numbers and found that, at the price they've been selling these chips at, scaling their power down would take more R&D and manufacturing spend than the additional margins from selling a smartphone variant would cover.
And when they saw those numbers, they pulled the plug. Don't blame them. The smartphone market is saturated as hell, it's becoming less profitable for everybody (even Apple), and Intel will never get Apple on board with an x86 CPU. They'll be lucky to get many Android OEMs, especially Samsung or those in bed with Qualcomm. Even if they got the WinMo market, it wouldn't matter because fewer than 4 million Windows Phones were sold last year. To put it in perspective, Apple sold more WATCHES than Microsoft sold PHONES.
Intel is right to bail on this market. It isn't going to make them money. Not "Intel" levels of money.
There have been some design wins with Atom in Android, such as the Samsung Galaxy Tab 3 10.1. - And for the most part you wouldn't have known it was an Intel chip.
The ecosystem with volume is the ecosystem that wins in the end. Intel has relegated itself to the position of IBM --- they can continue to sell big expensive systems, but they'll become increasingly marginal to mainstream computing.
Like I said above, smartphone SoC revenues (when properly accounted for) likely already exceed Intel revenues. Meanwhile smartwatches are about to take off, and we'll be getting more and more small CPUs in everything --- scales, dashcams, headphones and speakers, VR/AR headsets, cars, etc. In time this market will probably expand to the size of the smartphone market --- each item is cheap, sure, but you'll own 20 or 30 items each with a small CPU and a small radio in them. And Intel will have no relevance whatsoever to this market...
"each item is cheap, sure, but you'll own 20 or 30 items each with a small CPU and a small radio in them. And Intel will have no relevance whatsoever to this market..."
Except all the servers in all the datacenters that power all those devices run Intel CPUs. Without the cloud, those "smart" devices aren't very smart. The price and profit margins on those server/datacenter CPUs are a lot higher than the $6 SoC in your smartwatch.
You can sell a mobile chip with every toaster, while you can only sell so many chips for desktops, servers, and laptops... Intel is actually a small player counting all chips sold.
Yes, but Intel's margins are ridiculously higher than anybody else's. Only Samsung could perhaps have the same margins, and that's because they consume the chips through vertical integration, so there are no marketing costs, etc.
It's safe to say that for every $200 Core i5 sold, Qualcomm must sell over TWENTY $34 808 SoCs to make the same profit, because there is so much ARM competition that they probably make $5 per chip, while Intel makes $120 (60% margin).
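A back-of-the-envelope sketch of that comparison, using the same assumed figures ($200 Core i5 at 60% margin vs. $5 profit on a $34 SoC); the numbers are illustrative, not actual Intel or Qualcomm financials:

```python
# Rough profit-per-chip comparison using the figures assumed above (illustrative only)
core_i5_price = 200        # assumed Core i5 selling price, $
core_i5_margin = 0.60      # assumed Intel gross margin
core_i5_profit = core_i5_price * core_i5_margin   # $120 per chip

soc_price = 34             # assumed Snapdragon 808 selling price, $
soc_profit = 5             # assumed Qualcomm profit per SoC, $

chips_to_match = core_i5_profit / soc_profit       # = 24 SoCs
print(f"Qualcomm needs ~{chips_to_match:.0f} SoC sales to match one Core i5's profit")
```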
As smartphone sales start to slow down, this is especially a market Intel doesn't care to be in. Likewise, data centers are being built at record pace, and 5G towers are going to be going up everywhere in the next 2-3 years, so of course Intel is now focusing solely on Xeon and modems.
I wonder if they won't lose in datacenters as well, though. What if deep learning technology ends up in most data centers and they then mostly use Nvidia hardware?
"Intel is actually a small player counting all chips sold". Perhaps...but a company's stock doesn't ride on raw number sold. It is based on profit. And a small number of relatively expensive Xeon server CPU's is orders of magnitude more profitable than huge numbers of $3 phone CPU's.
Some may wonder why Intel was so late to the game. I take the other approach: why did they bother? They have markets where they are number 1 and likely to stay that way for a long, long time, i.e. laptops and desktops (admittedly a flat market now, but phones and tablets are headed the same way) and servers (a big growth area). Getting into phones and tablets was always a distraction from their core business in my book, one they shouldn't have bothered with in the first place.
That said, I've got a Z3580-based tablet where I can't tell the difference in performance vs my Galaxy Note 5 phone (yes, I've seen the benchmarks, but they don't matter for my use cases), and if nobody had told me it was an Atom CPU in there and I just tried it out? I would never have known... nor would I have cared. So they reached parity but did it too late, and in the end this isn't the business they should be in in the first place.
What is the battery performance like on the Z3580? Does it match up with ARM? I think the reason Intel is late is that their focus had been on raw performance (MIPS) while ARM's was on efficiency... they likely looked ahead at a market that was cresting and decided it wasn't in their best interest.
I do wonder about their IoT angle though. Isn't that the same performance/efficiency curve that smartphones are on?
TSMC's revenue is about half Intel's revenue. That's just the TSMC part, there's also (in that revenue stream) the Rockchip, Mediatek, QC, and nV parts. Then there's also the Samsung side of the equation. And the imputed revenue on Apple's side.
I'd bet the total is already larger than Intel's total revenue...
@name99: "TSMC's revenue is about half Intel's revenue. That's just the TSMC part, there's also (in that revenue stream) the Rockchip, Mediatek, QC, and nV parts."
TSMC is a fabrication house. Intel doesn't exactly make much money on fabrication. The revenue TSMC makes comes from Rockchip, MediaTek, QC, nV, AMD, etc., so it's more of a negative for ARM profit. Also, TSMC does more than a little bit of business that has nothing to do with ARM chips.
There are an awful lot of sandwiches sold today, and nonetheless Intel is not investing or prioritising in the sandwich market. The year to be selling smartphones was 2014; it's now basically a replace-as-they-break market, much more cost-driven than it was.
Like I said, they were making billions in profit every quarter on desktop/laptop/server chips... Also, the mobile market was crowded. Also, it was probably a mistake, but that is how it may have seemed to make sense. Say the same thing for MS. From the early 2000s until 2012, how did it make sense for Microsoft to put in a "half assed" effort on a smartphone OS? Bad choice in hindsight. By the time Windows Phone was actually pretty good, no one cared to switch.
Actually, they did put priority on it... The Atom went from a 5-year product cycle to a two-year cycle like the Core series... The Atom also rapidly moved onto the newest fab; there were just delays with the 14nm fab in general that put Intel behind schedule, but they still got Cherry Trail out within a year of Broadwell, and Cherry Trail even used an Intel Gen 8-based GPU like Broadwell's...
The problem is that ARM dominates the mobile market, is a cheaper technology for OEMs to invest in, and OEMs can customize ARM in ways they can't with Intel SoCs. It was basically a hard sell for Intel without any clear advantage, which they couldn't offer without going to their much pricier Core-series products...
Sure, profit margins played a role, and they also didn't want to give up the margins on their higher-end products, but the Atom hasn't been a gimped or low-priority product for over three years now... It just hasn't been successful enough in the mobile market versus ARM, which is doing especially well in GPU performance, which has started to rival even some of Intel's higher-end iGPUs...
"The ATOM went from a 5 year product cycle to a two year cycle like the Core series" But...Apple, Qualcomm and Samsung come out with a new generation every year. Of course I expect this to grind its way down as phones and tablets reach "good enough" and people start losing the itch for constant "upgrades"
No, when referring to Intel's 2-year product cycle, it is referring to an actual cycle between their fab advances... Meaning everything that happens from one fab advance to the next, which includes yearly updates...
For Intel this usually falls into what is known as the tick-tock cycle... The tick being the fab advance, which offers the usually smaller product improvement, followed by the tock, which offers the bigger architecture advancement that improves the product on a given fab...
Intel previously held a fab lead because they rapidly advanced to the next fab... Only recently has ARM managed to close this gap... It helped that Intel has had issues with their fabs and thus had to temporarily extend their cycle to 2.5 years, and we may not see the 10nm fab from them for close to another two years.
Besides, aside from the switch to 64-bit, ARM hasn't seen a really big update in a while...
So no, it wasn't an issue of SoC updates versus the competition... Though you're not entirely off, because Intel wasn't updating what went into a typical system as rapidly... Things like 4K cameras, faster storage, etc. are what Intel was really falling behind on.
The advantage of ARM is that it is extremely easy to customize and scale... While Intel products were whole-package or nothing... The new Goldmont update would have finally changed this, but it was basically too little, too late on that score, and products are about more than just how good the processor is anymore... The entire product ecosystem is what matters in mobile...
Even companies like Nvidia had to give up on phones because of this, though that doesn't mean they can't find other niches they fill better... Nvidia is doing very well in cars right now, for example...
It remains to be seen if Intel can make a go at IoT, either in the actual products or in the infrastructure that supports those products, such as their new emphasis on 5G...
The long answer: Intel could absolutely build a killer smartphone SoC on the 14nm process. However it would need to be Core based; the Atom CPU line isn't performant enough. They may need to go back to Ivy Bridge or strip down Skylake a bit to get there, but they could do it.
The issue is that smartphone SoCs have very low margins. MediaTek, Rockchip, etc are crushing the market. Atom is designed for margins, not performance. If Intel built a winning Core smartphone SoC, then they'd have to sell it for significantly lower margins than what their other Core parts sell for. And that in turn would put pressure on the rest of their chip stack; why should an OEM pay $250 for Core-M when you could get a similar smartphone SoC for $30?
For what it's worth Apple faces a similar challenge. But being vertically integrated means they don't have to share with others, and the total profit off of iPhones/iPads can offset the higher costs involved in developing and fabbing the A-series SoCs. However Intel doesn't have that luxury, and Asus would never let Intel have all of the profit.
Smartphone SoCs are a race to the bottom. And Intel as a business has nothing to gain from taking part in that race. The winner is the guy who got badly hurt but didn't die; a Pyrrhic victory.
I agree with this. Intel has been averse to getting into markets that become commoditized. How many smartphone parts do they have to sell to get the same margin they get from a single sale of a server-class CPU that costs thousands?
However, the margins in smartphones were crap two years ago as well. Why did Intel keep throwing more money at it as long as they did if margins are and were their ultimate goal?
Probably due to the fact that margins are dropping in their PC market and possibly in DCG as well, based on two quarters of lower ASPs. These "core" segments can no longer provide Intel with sufficient margins to offset this ill-advised attempt to be a player in the low end.
They'd better hope that their FPGA business picks up to offset the major drop in volume in their fabs from a crashing PC market and now non-existent phone processors. With all the fixed costs associated with fabs, it is also a margin killer to run them at low utilization rates.
I wouldn't be surprised if Intel has already started cutting shifts at some of their fabs starting this quarter. They have a lot of inventory to burn off as it seems they chose to keep the fabs running through Q4 and Q1 holding out hope for a rebound in the PC market that hasn't seemed to arrive yet.
As far as FPGAs are concerned, they missed their guidance in Q1 by over 10% on their new Altera business. It's early, but if they are that far off on the guidance, how far off were they on the proper amount to spend on Altera in the first place?
Some companies saw this writing on the wall earlier, some later. The Apple report makes it official: smartphones have peaked and are entering the commodity phase. Nokia got out of phones when they could, and they aren't coming back. Microsoft realized a couple of steps later. TI and Nvidia, longer ago. Now it's going to be about IoT and cloud, with a few side areas like cars. The big players will acquire, partially adapt, and survive, but the real winners will be those few companies that really get it first, and can execute to perfection.
Personally I think Atom, as it is, is completely competitive with most ARM configurations, especially the mainstream Qualcomm 808 and Exynos 5000. It rivals them in most non-GPU tasks, but obviously only in tablets. In smartphones, where it doesn't even exist, it'd be too hot and too power-hungry. The R&D to fix this outweighs the possible returns of selling low-margin chips. But the R&D would only be in further shrinking Atom down, not scaling Core into an SoC. Making Core, even the X3 variant, less than 4 watts with a full onboard modem would be almost impossible without some huge compromises and billions of dollars.
But if it only rivals them CPU-wise, loses GPU-wise, and runs hotter/pulls more power, how is it competitive?
When it comes to something like a tablet, battery life is very important, and that is something the ARM market has in the bag. Tablet-class Atoms just use too much juice. Why Intel didn't make an ARM-based Atom is beyond me.
No, battery life wasn't an issue and neither was heat... It was just margins, GPU, and the device hardware ecosystem.
But changing to ARM wouldn't have changed enough to matter for Intel... Their expertise is in their own technology... Just look at Nvidia: they tried to get into the phone market too and failed. Nvidia just managed to find niches that they do well in, but those niches aren't the same as everyone else's; they're in areas where Nvidia has expertise... When companies forget this is when they fail...
It is more complex than that. You are correct, but in a different way. The mid-range to high end is still lucrative enough for Intel to care about. Atom's performance and size aren't for the low end anyway. They could have gone with a big Core chip, but that would be at a performance level above even Apple's chips, no Android device can sell in that price niche, and it would be overkill for the Android operating system.
Intel cannot R&D the Atom fast and cheaply enough to compete with ARM without resorting to brute-forcing it with their manufacturing might. It is probably a matter of priorities, as they can't serve two masters at the same time. See, when they release the latest process node, server/enterprise products get the bulk or all of it for the first few months to a year, as those customers will pay any amount just to be on the cutting edge. This is hard to refuse, as their long-time, loyal customers are paying anyway, leaving no capacity for Atoms for a while.
Intel wouldn't mind losing profit here, as it is more important to get themselves into the smartphone/device ecosystem and have a share of it. This is the reason why they had to contra-revenue Atoms and hide this from investors, negating the higher margins of the PC/server products.
OEMs don't want the Atoms, despite being paid to use them, due to inferior performance and other smaller reasons versus an equivalent ARM chip. It is a market Intel couldn't get into, and other chip manufacturers are closing in, which I believe led Intel to leave the device SoC market for now.
MS also plays a part here, as they should have had a Windows Mobile OS based on the Windows PC x86-64 codebase; it would have been faster and easier to deliver and, more importantly, would have had app availability. That, I believe, would have had a better chance in the smartphone/tablet market and would have required the use of Intel's chips.
I agree. Microsoft fucked Intel with Windows RT, not the other way around. Intel MADE the Atom for Microsoft, but Microsoft didn't respond with any interest. The closest they came to a lightweight version of Windows was 7 Starter Edition, which was just a neutered shell, not an optimized kernel.
Then Microsoft bought Nokia, a huge F U to Intel, because of Nokia's decades-long ties to Qualcomm.
I think after Intel waited around for Microsoft to introduce what we all thought would be an x86 build of Windows 10 for mobile, not an ARM port like Windows Mobile has always been, they got sick of waiting, saw the sales numbers of the WP platform, figured Apple and Samsung weren't going to ditch ARM since they already have their own in-house development, and really, Intel wants no part of the Android phone market because there is SO much competition they would literally be racing to the bottom.
Right, Windows RT is not a good sign for Intel. Windows RT was Microsoft's backup plan in case ARM got a lot better and Intel didn't step up their mobile chip offerings. Unfortunately for MS, ARM did not become significantly better and Intel did produce some decent SoCs, albeit quite late to market.
Many articles mention Windows RT as a flop, but knowing Windows 10 Mobile is only available on the ARM architecture, to me it seems it is Windows RT with a different skin and additional features. Someone please correct me if I'm wrong here.
This gave me the idea that another reason Intel threw in the towel is Microsoft not going with x86 for Windows Mobile anymore. MS probably thinks having mobile devices that are full-fledged Windows PCs would hurt the PC industry even more. It would, in the short term, but having users sucked into the Windows world is better than them preferring Android/Google or Apple.
Do we now use margin and markup interchangeably? Because the 300% "margin" you mean implies $1 cost of goods selling for $4, which is really markup. Margin used to mean dividing by the price: a 50% margin implies dividing the cost by 0.5 to get the price, which works out to a 100% markup.
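A minimal sketch of the distinction, with hypothetical numbers:

```python
# Margin vs. markup: two different ratios over the same cost/price pair (illustrative numbers)
def markup(cost, price):
    # Markup: profit as a fraction of cost
    return (price - cost) / cost

def margin(cost, price):
    # (Gross) margin: profit as a fraction of selling price
    return (price - cost) / price

# $1 cost sold for $4: 300% markup, but only a 75% margin
print(markup(1, 4), margin(1, 4))                 # 3.0  0.75

# A 50% margin means price = cost / 0.5, i.e. a 100% markup
cost = 1
price = cost / (1 - 0.50)                         # $2
print(markup(cost, price), margin(cost, price))   # 1.0  0.5
```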
Both Intel and the ARM SoC builders had the same target, but ARM was going at it from below (improving performance with each generation), while Intel was coming at it from above (trying to drive down power). I think it was only a matter of time before both sides were on a level playing field in terms of performance, but by the time we got there, ARM-based chips had come to dominate the market. Intel has no advantage, and honestly I bet most OEMs are happy not to have to deal with Intel for chips for a change. I can't say that I'm sad, as Intel dropped the ball when they didn't see this coming.
Qualcomm is basically the Intel counterpart for ARM outside of China and as we saw last year with Snapdragon 810, they completely dropped the ball for a generation of chips and being forced to use the generic ARM design left them with an underperforming and hot chip for the higher end of the market. None of those OEMs were happy with what they got but there was no viable alternative for the market.
They didn't support the ARM instruction set that everyone else was using, and they purposefully gimped chips to artificially separate models (like they do with PC CPUs); no one else does that.
^ Correct, and they sold off all their own ARM investments and the chips designed on them before Apple came out with the iPhone. They bought themselves a world of hurt. Apple will soon have no desire to be tethered to Intel in any way, shape, or form.
Intel will be a supply vendor like the rest of them, at Apple's discretion.
Margins. You can't get >50% margins in that space. You can see that Intel crippled Atom on both the CPU and GPU side, but especially the latter. More GPU means a bigger die, which means less money for Intel.
The second point I've heard is that their whole company structure, from design to fab, just isn't suited to the fast iterations required in the smartphone market.
Company incentives (read: profits) and an unwillingness to switch to ARM for mobile. Despite what most others say, ARM is still more efficient in mobile, but only when compared on a similar process node.
Ever since Intel entered the server market their business model has been based on their ability to sell expensive server chips that had their research and development funded by a massive desktop population.
They've used every trick in the bag to ensure those user groups stayed as separated as possible, so server users couldn't actually use desktop CPUs instead and destroy their margin.
That doesn't work in the other direction, but most importantly, keeping desktop users away from mobile chips at ARM prices is becoming increasingly difficult: what's the benefit of selling millions of additional mobile SoCs if each one means one less desktop CPU sold?
After running RemixOS (Android x86-64) on a €100 N3700 motherboard (no moving parts) I have to conclude that anything more expensive requires special needs (workstation/heavy duty gaming) to justify as a PC, laptop or HDMI stick.
Even Windows is ok, unless your other PC happens to be a 4GHz i7 (where the change of pace becomes noticeable).
Intel is becoming a niche player. They have finally realized it and we can be sure that they'll continue to fight hard and dirty to make sure that niche stays as large and as profitable as they can make it.
There is one camp that says this is all economics: Intel was afraid to create good chips in this space because those would compete with their mid-range. I don't buy this. If Intel had thought that way, there were MANY different ways to avoid the problem. (For example, they could have shipped phone CPUs that were x86-64 only --- able to work with modern compilers and dev tools, but unable to run any older Windows software.)
I think the issue is the same one that x86 fans constantly refuse to admit. The complexity of the x86 ISA (broadly considered; not just the instructions but also the accumulated supervisor details like PAE and SMM, the complicated hardware expectations, and the memory model) is what did them in.
Creating one of their CPUs takes twice as long and vastly more people than on the ARM side. At first they wouldn't believe this and didn't budget enough people and enough time to maintain pace with ARM. By the time they did realize it, it was basically game over. They just could not move as fast as ARM, and once they lost the process advantage the future was obvious.
If they could churn out competitive CPUs at the pace (and cost) of Apple or QC, they'd be fine. But complexity costs. They refused to accept that ten years ago, and today we see the consequences.
(And so much for their "We will power the IoT" crap. Yeah, you'll power it with what? Your stupid Quark chip that no-one's interested in and your non-existent 5G modem? Quark shows exactly the same stupidity as Atom, and will fail for exactly the same reasons. By the time Intel have finally shipped a version that matches what smartwatch vendors wanted in 2015, the ARM eco-system will have shipped three impressive updates to their various SoCs.)
ARM is, IMHO, in a fundamentally better place than Intel because they're willing to accept some short term pain in order to tame complexity. They've changed ISA a few times and they've been willing to abandon mistakes or obsolete ideas (like Jazelle). Right now they have to carry the burden of both the v7 (including Thumb) and v8 ISAs (and some hassle related to memory models and the older interrupt model), but we already have the server vendors saying they've dropped everything pre-v8, and I expect Apple will do so in the next year or three, followed by ARM itself. This is a fundamental in engineering --- complexity WILL kill you, unless you CONSTANTLY work to weed it out.
Intel hasn't had success with a new design since the Pentium, and that dates to 1993 or so. They managed to recover from the Itanium fiasco thanks to AMD, but that caused them to focus back on x86, while the world was changing.
Basically there are a few reasons. This is my opinion, at least. I think the main point at which they messed up is when they sold off their ARM IP to Marvell and decided to bring in a low-cost, low-power x86 arch in its place. This created the problem that this chip could now potentially compete with the 'main' x86 arch lineup of chips (currently Core). That means the Atom core has always been artificially limited in terms of performance.
See, I think that if they had instead kept their custom ARM IP and continued to develop it, they could easily have been a market leader. There are several reasons for this: 1) they would be selling an ARM chip into an ARM market, which is MUCH easier than selling an x86 chip into an ARM market; 2) since it was an ARM chip, it wouldn't really compete with the main x86 arch, which means it would not have to be artificially limited in terms of performance or capability; and 3) I have little doubt that Intel could have designed proprietary ARM IP cores that could compete in terms of performance and efficiency with the likes of Qualcomm, Samsung, and even Apple.
Because Intel got arrogant and lazy. They thought their process advantage was big enough that they could beat ARM with one hand tied behind their backs, artificially restricting performance to avoid eating into more lucrative Core sales. Turns out that ARM 1st tier products were more than a match for Intel 2nd tier products. And now that the foundries finally got FinFET right, Intel's process advantage is much less than it once was (probably about a half-node ahead at this point).
Had they taken this market seriously from the beginning, they might have won.
Intel had legacy issues. They were based on an x86 CISC design, which requires a lot of transistors. This means there is a considerable power drain to maintain backwards compatibility with older x86 processors and code. So even though Intel chips used a smaller process node, the extra transistors ate up that advantage.
Add to this the fact that Google didn't actively develop drivers specifically for Intel chips. Intel invested a ton in programmers to create drivers for Android. And quite a few of their products were less than stellar in long-term stability. (Ask someone who has experience with a few.)
The added complexity, the lower profit margin, and the need to develop in-house drivers became a death knell for Atom in the long term. And Intel is LOATH to compete in the lower-profit RISC/ARM space.
I predict a long-term shrink for Intel over the next 15 years of around ~50% of market cap due to a number of factors (people moving to tablets, lack of progress on any real speed improvements, and Moore's law limitations). Server market and cloud is the correct strategy. But Intel has a slow walk into home-computer obsolescence in front of it.
AMD has already cancelled their cat CPUs. Now if Intel cancels Atom, it will leave a giant void in the sub-$250 client computing device world (desktop, laptop, Chromebook, and even tablet).
Well, the technology has been getting more expensive with each node, while users' upgrade cycles have stretched out because the current generation is "good enough". That means there might not be enough revenue to support the R&D any more (unless they suddenly get another 3 billion users from developing countries or something). So, either the unit prices go up or the R&D slows down (14nm for 3 years rather than 2?) Doubling the lifetime of PCs is great for us, great for the environment, but *halves* the revenue of places like Intel. I guess what I'm saying is this might be an inevitable stage after the initial growth phase.
That is, 'the brand Atom for consumer desktops' may well be eliminated completely. There will still be cheaper chips branded Celeron containing a two-wide out-of-order microarchitecture, next to slightly more expensive chips branded Celeron containing a four-wide out-of-order microarchitecture.
What about Denverton, the successor to the Atom-based Avoton? I hope they are not abandoning this too. The C2550 and C2750 chips have been great for a firewall/router/NAS, and it would be really nice to see a successor.
I built a NAS/VM server out of a C2750, and it's pretty good, and supports ECC.
But honestly, even with 8 cores, it's just not fast enough. If I were to do it again, I'd get the lowest-end Xeon or i3 that has two fast cores and ECC support.
It's just that Intel doesn't make it economical to get ECC and server-type features.
I think what's dead is actually the idea of an x86 phone, and with it Win32 apps. The Windows Store serves 280M people and growing, and the Universal Windows Platform is gaining momentum. I doubt Microsoft was ever planning on an x86-powered phone.
For the original poster: I think MS was greedy and planned to provide consumers with two separate computing devices, the classic Windows PC and a Windows mobile device, as the trend back then was for people to have a PC and a smartphone.
Now, people are happy without a PC because smartphones became so sophisticated that only serious content creators still need one. They should have started early on with a cut-down version of x86 Windows so that app availability would have been good from day one.
"people are happy without a PC because smartphones became so sophisticated that only serious content creators still need one" PCs are going to be with us for a long time, simply because of the ergonomics and the need to get work done, plus torrenting and light gaming (MOBAs, other free games, RTS). What changes is that nowadays average teenagers and children are okay with having just a two-year-old tablet and a smartphone and use those; they aren't going to buy a big laptop or even a desktop like they traditionally did to cover their communication needs. Time in front of a PC decreases massively if you eliminate these uses, and one PC for the family is enough.
@osxandwindows : Only time will truly tell, but your reference to UWP apps as mobile suggests that you don't understand the architecture and where Windows is headed.
I'm a Windows developer, and if I can write an app and push it into the Windows Store, which provides distribution to 280M+ customers for a 25% cut, why would I choose anything else unless I'm just writing a mobile app that doesn't need a "store"?
I have written many Windows desktop / Win32 apps and have built custom installers and have distributed that software via CD-ROM or downloads and the UWP experience is much, much better.
"Those mobile apps won't ever replace desktop apps"
Looks like someone didn't play around with the new Windows XAML and the UWP. This platform is more than capable of replacing fully fledged desktop applications for most users. This isn't Android/iOS (nor Windows Phone) where conventional mouse and keyboard support isn't front and center. Those same UWP applications with UI optimized for mouse and keyboard can scale perfectly for touch and smaller form factors, in the same binary...
Oh, you want a good example? Take a good look at Word and Excel Mobile.
Sandbox, my friend. Why do you think people prefer the desktop version of Adobe CC instead of, you know, the universal version? Hell, let's go a step further and make all Windows games universal. Just look at the Dropbox app in the Windows Store.
"Hell, let's go a step further and make all Windows games universal." To elaborate on this, why not make Windows games universal? Since we're already making "fully fledged desktop applications", lol, why not?
I worked on IA64, and many years later, after that bombed, I asked the CEO directly at an employee forum how long they would stick with x86. It was clear they were not ready for big changes.
All Asus ZenFone releases in 2016 will move away from Intel to Qualcomm (higher-end offerings) and MediaTek (lower-end offerings). Not sure if there is a single new phone out there that will continue using the outdated Intel chips.
When you flood the market with 10k already-trained engineers in an already mediocre-looking job market (look up the DoL projections), then I don't think it really matters who is scooping up whom, since it probably won't be me getting scooped up.
iwod, I salute you but would suggest the question should be "What will happen to all businesses?" Please see my later post, chortle at my naivety, and when your sides stop hurting, do say what you think....
"...Aicha Evans said that she wanted a big contract in 2016, otherwise we might not see her in 2017."
I'm confused. Aicha Evans was fired/resigned a few weeks ago, so wouldn't that suggest they are giving up on discrete modems? Did they actually tell you (Ian/Ryan) that discrete modems remain a focus?
While perhaps technically so, both WSJ and Bloomberg are reporting that she has already given notice of her plans to leave. I'd link the articles but don't know if that is against the rules and they are very easy to find.
Given that, it surprised me to see her commentary being used to support Intel's continued discrete modem ambitions.
This explains why ASUS split the latest ZenFones to have both ARM and Intel models. And it kind of explains why Microsoft did not make the new Lumias x86 phones + Continuum.
But this news also means that Microsoft's smartphone numbers are now officially only going to keep declining until they hit zero. My last hope for a useful x86 device with a full desktop in a really small format: gone forever.
But I hope they don't kill Atom, because that would mean you can't get a tablet/2-in-1 with Windows in the $100-300 range anymore!!! All those CHEAP stick PCs would be gone as well. So many alternative formats... gone... So really, I hope they don't kill it. Really, I don't think they will either, as that lineup just finally started to pick up the pace, and the Wintel combo is finally pushing Android out of those markets.
Look, there's much informed debate going on here, so I am reluctant to look stupid by asking a question that will seem barbarously naive at the very best. But since it's causing me a considerable amount of grief, I'll have to get over the social embarrassment, given this opportunity to become better informed. So, here goes!
Isn't this really the formal announcement of the death of 'Moore's Law of Computing'? Gordon Moore always said it would reach a natural conclusion, an end-point, and this is it. Accepting there are still performance gains to come, both from architecture tweaks and software efficiency improvements, hasn't semiconductor technology come to the end of the road?
Since 1965 it has probably been one of the most important drivers of the economy, and even today it is constantly referred to as the force that will provide the performance gains required to power the future of just about everything, from the Fourth Industrial Revolution up to national and global security.
There is much talk of the development of different forms of computing - quantum, DNA and so on - the list is endless, but seemingly no general consensus on what, when and by how much. There is certainly nothing in the pipeline that would allow for a seamless transition from one technology to another.
Yet few people mention this or appear concerned. Is this because it's just taken as a 'given', accepted by everyone and too depressing to talk about? I can think of many more options, but I won't list them, because I'd like to appeal to this community for their own informed views, for which I'd be very grateful.
There is a very serious reason for me asking this. Over the last few years I've been a very junior member of a research project (basically, I buy and lick the stamps and post the letters of the clever people) that has had access to some of the cleverest people alive, which has resulted in a view of the future that is alarmingly different from what most people expect. One thing is certain - on this topic, nobody agrees, and most are unwilling to state their position due to insufficient information.
Actually, that's not strictly true. There is complete consensus on one scenario. Any talk of a technological singularity, in the terms in which it is generally understood, is ludicrous at best. In addition, the promoters of the idea are talked of using language ruder than I've ever heard used before.
I'd like to ask any member of this community to comment or reply in the knowledge that it serves a serious purpose. I should be clear that the ideas expressed in this post are not ones I hold personally, simply because I don't have the knowledge or experience required to do so. Please be as direct as you wish!
First of all, Moore's law is a little more complex than "compute performance will double every x months." There are some important economic factors underneath, which are currently changing, and one of those is *direct* competition, which is no longer really happening. Instead we see an evolution into niches, with each niche guarded by its own gorilla.
One of the cornerstones of Moore-type progress was that the quadratic increase in usable transistors from a linear shrink in process geometry could be translated directly into computing power.
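To put rough numbers on that scaling (an idealized sketch; marketing node names don't map exactly onto feature sizes, so treat the figures as illustrative):

```python
# Idealized Moore-style scaling: a linear shrink in feature size gives a
# roughly quadratic gain in transistors per unit area (illustrative only).
old_feature_nm = 22
new_feature_nm = 14

linear_shrink = new_feature_nm / old_feature_nm          # ~0.64x per dimension
density_gain = (old_feature_nm / new_feature_nm) ** 2    # ~2.5x transistors per mm^2

print(f"Linear shrink: {linear_shrink:.2f}x per side, density gain: {density_gain:.2f}x")
```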
That computing-power gain curve has hit serious trouble with things like the gigahertz wall, and the resulting explosion in core counts hasn't produced linear gains for many consumer workloads.
It's probably more appropriate to say that Moore's law is running out of steam, especially as the cost of shrinking is rising more sharply than the compute-power gains (e.g. EUV vs. multi-patterning/multi-masking).
But what's coming to an end before Moore's law is general-purpose compute, one architecture for all jobs, which is what x86 stands for most.
We've seen this a bit with GPUs taking a huge chunk out of general purpose CPUs both in games and HPC and I believe we are going to see it also at the other side with memory intensive workloads, where transporting data to the CPU for processing is starting to use more energy than the processing itself.
The logical escape for those compute scenarios is to move some of the compute towards the RAM itself and take advantage of the vast parallelism available within those row buffers.
But you won't see Intel pushing that, just as you didn't see Intel pushing non-x86 GPUs until it was about hurting AMD and Nvidia (today more than 50% of a desktop "CPU's" silicon real estate is actually "GPU contra revenue", targeted only at keeping the competition starved of revenue).
Intel wants an effective monopoly to maintain their margins and they are fighting hard and dirty to achieve that. Their natural escape valve, the one they have used for decades, faster general purpose CPUs, is getting both more difficult to build and less and less successful in creating value, because special purpose architectures made possible by web scale companies like Amazon, Google and Facebook (and the ARM mobile space), byte (intentionally misspelled ;-) off large chunks of their general purpose CPU slice.
So do not mistake Intel's trouble for a sign that compute will stop evolving.
It will just evolve in places and in ways that are less Intel, and there is some inertia, as the entire IT industry will have to turn to new architectures.
The x86 may well become another mainframe, but won't be the one and only architecture Intel wanted it to be, including graphics (remember the Larrabee?) or including the IoT space (remember the Intel Quark?)
Good point. Most of the increase in the computing power of Intel desktop chips over the past few years has been in the GPU. So Moore's law is still operating, but not in a way that helps people who are going to buy a discrete GPU in any case.
The materials and technology path forward in computing power, to keep the same pace as what "Moore's Law" has yielded over the last 40+ years, is a bit difficult even over the next 10 years, let alone another half a century. This is simply due to the laws of physics that Moore himself knew all those years ago. Hence, 'the natural conclusion' you refer to.
So what are the challenges? Well, there are 2 major ones. The first is the light source to enable the lithography. EUV is coming along, but the development has been anything but easy. Although the research world has options such as e-beam writers and synchrotrons, these techniques are not 'fast' relative to current UV light sources and production. They are also expensive. Hence the reason that companies are pushing current UV light sources to the limit by doing things such as double, triple, and quadruple exposure/patterning. But each patterning pass adds what I estimate to be 10 steps (adhesion promoter HMDS, bake, spin photoresist, bake, expose to light, bake, develop, rinse, dry, quality-control check), which adds significant time, cost, and increased difficulty for the final product due to alignment issues. Plus the extra mask costs and development time. Multiply this out for a few of the bottom layers in a device and it's no easy task to make things work. Hence the R&D costs and years of hard work by all of the materials scientists, engineers, and staff behind the scenes before it even goes to the production-line people and their worry about yields, defects, ...
The other major challenge is on the materials side of making a transistor. My take is that FinFETs were developed as a means of lowering the leakage rate of electrons across the device (hence reducing the 'off'-state power) by making the effective gate length equal to that of a larger node process. But silicon itself has some fundamental limits, hence the development of III-V materials on large silicon wafers, because even 40 years ago people knew GaAs was faster (higher electron mobility) than silicon. Further, there is the recent interest in 2-D materials (WS2, WSe2, MoSe2, graphene, ...) as a means of conducting an electron from one place to another quickly and at lower power. But all of these options take time to develop. Also keep in mind the solution must be able to scale to manufacturing speeds and be reproducible beyond normal comprehension to make a billion-transistor device at a rate of 100 million per year. One thing the semiconductor industry has done correctly is to set standards (e.g., 300 mm wafers as the defined unit) so multiple companies have a direction to make machines that work interchangeably with others in the overall manufacturing production line. Compare that to the numerous starters, battery sizes, tires, rims, etc. of the automotive world.
So is "Moore's Law" dead? Well, yes if "Moore's Law" is doubling transistors every node jump every 24 months. Node jumps are getting fewer and fewer. Look at any roadmap. But will 'performance' increase with time? Sure. There is a need in terms of computing power per unit energy input. It just won't be as fast and as easy.
As for the rest, sounds like you work at an interesting place!
End of Moore's law for Intel, but not for computing as a whole. Actually, it is Intel's or consumers' decision that drives Moore's law. As long as there is no urgent need for compute power, Moore's law will be violated. Yet if there's an application that will utilize it, Moore's law might even be exceeded.
If I'm right, Intel is preparing for a big change and a new focus for whatever new industry they are going into, and they haven't left a clue yet. I think this is the case due to these massive changes that are not related to revenues.
Moore's Law ended years ago. It went from 18 months to 19 months. Then it went from 19 months to 20 months. No one can really say exactly when this happened because it has been happening so gradually.
Any thought to the notion that Intel gambled on Microsoft's Windows Mobile OS getting better traction by now?
It seems to me there has been, and remains, a concerted effort to make your phone a very miniaturized PC running a real Windows kernel. The newest high-end Lumia 950 is a peek at the first layer of that onion being peeled, with its clumsy wired display setup.
I always kinda saw that merger happening at a certain convergence of technologies - say, about when a 7nm core does what a Skylake i5 does now, along with a 'Thunderbolt III AIR'-level bus, where you just walk in, drop your phone on your wireless charging pad, and instantly your big screen lights up along with your Kinect and Cortana and whatever their VR setup is - but you don't have to wear a whole helmet, just an earpiece with a little bar like a mic but at eye level that projects across your eyes.
By about 2020-2022 Windows has AI, and now you're in Tony Stark's lab with JARVIS.
All in all, I think Intel is (sort of) making a mistake here. A little context first, though.
I believe what we're seeing is the simple, unavoidable fact that continuous growth (economic or otherwise) is a stupid myth perpetuated by capitalist ideologues lacking contact with reality. We've long since reached a point where the most popular computing devices are "good enough" for >90% of use cases. The cut-off for this has been gradually postponed as new and more practical/accessible form factors have been made possible (the transition from desktop PC -> laptop -> smartphone), but we're running out of "growth opportunities" there as well. The next logical step is smaller devices that can fulfill all the roles of the larger ones (as with Continuum and similar concepts), so that people have one device instead of two or three, but this will again lead to lower hardware sales, not higher. The same goes for performance - as people are more or less happy with smartphone performance now, they don't see the need to upgrade annually any longer. Crossover/all-in-one devices have little potential to mitigate this. To wit: the largest cell operator here in Norway has reported that two years ago, the average smartphone replacement rate was every 14 months. Now, it's every 27 months. In other words, those who bought new phones annually two years ago have since stopped upgrading.
This is unavoidable - and it's a good thing. Electronics waste and overproduction in unregulated countries is wreaking havoc on our planet, doing far more damage than we know. Reduced production, consumption and replacement is good for everyone except bankers and investors in tech companies. On the other hand, this inevitable decline will lead to less profit in the tech sector, and thus less money to spend on R&D - which will slow things down even more. If the companies are interested in surviving at all, they need to shrink investor payouts and allocate a larger portion of profits to future R&D, to avoid a vicious spiral. Investors will be pissed, but that's their modus operandi - they'd rather have huge payouts one year and then see the company fail the next year than have a steady, lower income for years on end.
Tech is inevitably trending towards a commodity economy, with small margins - sure, it's a necessity, but a necessity that to a huge degree is fulfilled by low-cost devices. Thus, cancelling low-cost platforms is a dumb move. However, I don't believe Atom is the right fit for this going forward. As it stands today, the mobile Atom platforms are more akin to the A53s than the A72s or Twisters of the world. It doesn't scale well enough to be future proof. And Core m is too power hungry, and throttles to stupidly low frequencies under sustained loads. Intel needs an in-between architecture, tailored for the 2-10W range, with a smaller die size than Core m. Apple has absolutely nailed this, demonstrated beautifully by them being the only SOC manufacturer able to produce something that didn't overheat on 20nm yet still performed on par with or above competitors. With A9(X), they showed how scalable this architecture was on a properly working process node. I'd love to see an A9X in a Surface Pro-like package (with the passive cooling solution used for the Core m3 version) and a more lenient power limit - that would be an interesting tablet.
Oh, and Intel needs to implement some sort of per-core sleep function (like Android ARM SoCs have), otherwise they'll never be able to scale in the way future devices will require.
Is there any reason why Intel just doesn't get an ARM license and leverage its superior fab process that way? They could potentially even win some fab business from Apple.
Intel used to have an ARM license; they sold it because they couldn't make any money selling super-low-cost ARM parts.
That would be even more true today, when they'd have to compete with Samsung, Qualcomm, and MediaTek. Intel is in business to make money, and they've learned the hard way that selling $200 desktop chips makes a lot more money than selling $10 parts for Asus ZenFones.
Intel's fabs are not general purpose like TSMC or GF. They are highly customized and produced in lockstep with new CPU designs. They can't simply fab something else at will. This has given them a tremendous edge over time, they consistently stay 1-2 process nodes ahead of the competition. The downside is that they do not have the flexibility to fab just anything they want.
I don't get it. I feel like EVERYONE is missing the point here, and panicking over the "death of Atom" when no such thing is happening.
Atom has evolved to include two very different lines: The "lite" model and the "full" model. The "lite" model was focused on smartphones, and included PowerVR or Mali graphics. This is a very specific flavor of Atom, and it focused on extreme power savings--and to get there, it used third-party GPUs. Intel killed its plans for Goldmont-based SoFIA chips, which killed Goldmont-based smartphone parts, but also killed off the future of Atoms with third-party graphics.
However, Atom is NOT dead, and the "full" model is clearly alive. Intel has already clarified that it will continue shipping Atom x5 and x7 SOCs based on Cherry Trail. They also just announced Apollo Lake, the Goldmont-based Braswell replacement geared toward "netbooks" and "cloudbooks", as well as "2-in-1s" (which are basically premium tablets). Apollo Lake is designed to be modular and compact; Intel is heavily pitching how thin and light it is using soldered-down LPDDR RAM, eMMC, and Intel 802.11ac Wi-Fi. And again, no external GPU license, it's all Intel in-house.
This looks like a strategy we've seen a lot from Intel lately: They struggle with a new architecture, so they only roll it out in one place first, and keep making the prior-gen chips for everything else. Maybe this means Goldmont has a problem, and that problem has to do with achieving power savings over Airmont. So what does Intel do? Intel rolls out Goldmont first in higher-power devices, while it continues to make Airmont-based Atom x5 and x7 CPUs for now. A future refresh could bring new "Apollo Lake-T" replacements for tablets down the road.
So where did this leave Broxton? Broxton was a major, aggressive redesign. It was supposed to achieve power savings from Goldmont, and the utility of a modular "chassis" design that allowed easy integration of new features, including third-party IP. But what external IP would Intel want to integrate at this point? Intel has mature in-house 4G/LTE modems now, killing SoFIA puts an end to Intel's external graphics licensing (and Broxton was supposed to use Intel HD graphics regardless), so this looks like Intel abandoning third-party IP integration in its CPU/SoC development. The future for Intel will be aggressively developing its own graphics (which the Core team is already doing) and modems (see the aggressive 5G/connectivity push). Intel will want even tighter integration of these in-house components than a generalized modular design can deliver, so I don't see the point of Broxton anymore. Broxton isn't low-power enough to replace SoFIA, today it would be redundant to an Apollo Lake-T part, and tomorrow doesn't look like it's bringing third-party IP anymore.
Intel didn't say it's just giving up the mobile market. Their full statement says the opposite; it says that they'll try "to drive more profitable mobile and PC businesses." All of this refocuses Intel on using its in-house GPUs and moving away from third-party licenses. So what is Intel really abandoning here? They're not abandoning the mobile market. They're abandoning PowerVR GPUs and third-party IP integration. They're also abandoning out-licensing to Rockchip and Spreadtrum, which means keeping their in-house IP in-house.
So does Intel intend to just throw away SoFIA? There's a lingering question from the big 5G/connectivity push that Intel announced in February of this year. Alongside their 5G tease and some new 4G/LTE modems, they rolled out a brand-new Atom x3 part. This wasn't a Goldmont-based SoFIA part, and it's not identified anywhere as a "SoFIA" Part. The Atom x3-M7272 is a quad-core 64-bit Atom SoC with an integrated LTE/3G/2G modem for use in IoT, automotive, and embedded devices. It uses LPDDR2/LPDDR3 and the feature-set otherwise overlaps with the x3-C3445, so it was obviously intended to be based on the Airmont SoFIA LTE, the only SoFIA chips to be fabbed in-house at Intel. But the M7272 has no GPU; this isn't for infotainment, it's promoted for automotive connectivity. It was announced with a "long life availability & support program", so Intel clearly planned to keep making it for a very long time. It seems really odd that Intel would kill SoFIA to focus on IoT and connectivity solutions, but in the process, kill a key IoT solution it announced just two months ago. However, the M7272 isn't identified anywhere on Intel's website as a "SoFIA LTE" part, even though the C3405 and C3445 are.
With the SoFIA 3G parts cancelled, it doesn't make much sense to launch just the C3405 and C3445 as an incomplete x3 product line. But the only thing truly unique about the x3, that isn't already supported by other Intel teams, was the third-party graphics. The rest of the SoC, including the CPU and integrated modem, are all current-gen Intel IP. It doesn't take much to just disable the GPU and crank out IoT Atoms as a fully supported, short-term solution.
Given all that, here are my predictions for Intel's near future:
1) Intel goes ahead and rolls out the M7272 "v1.0" using a SoFIA LTE chip with the GPU disabled. They're 98% there already, and they need it as their current-gen IoT solution.
2) Reallocated personnel are used to fast-track a previously unannounced Atom core with an integrated LTE modem, IoT utility, and no third-party IP. This would be a die shrink of SoFIA LTE, except that it would replace the Mali GPU with a bare minimum number of Intel GPU EUs for applications that require video. Since this is all in-house IP, Intel already has experience building Airmont and Intel HD Graphics on smaller processes, and it would be fabbed internally, so it should take far fewer resources than SoFIA does now.
3) Intel announces "Apollo Lake-T" for tablets and its new in-house SoFIA replacement for IoT, embedded, and maybe even high-end smartphone devices.
4) Once the SoFIA replacement is up and running, Intel quietly switches over to an M7272 "v2.0" that is a drop-in replacement for the M7272. Like v1.0, it's the low-end Atom chip with the GPU disabled, but this lets Intel stop making the v1.0 part. Intel can also launch new IoT/embedded parts based on this product, building it out into a larger product line as part of their IoT and connectivity platform.
What constitutes the IoT? I hear a lot of buzz about it, but I've yet to see anything outside of hobbyist boards, Nest thermostats, and the like. Smart appliances still seem very niche and limited as far as I can tell. What am I missing? What are the big IoT applications right now?
IoT is a way for manufacturers of various gizmos to add (semi-)useless "functionality" that people usually don't want, that makes them far harder to use, and that inevitably leads to obsolescence and far earlier replacements, pissing off users even more.
What's the point of a "smart" LED lightbulb with an estimated life of 20 years, if its platform stops being updated after 2-3 years, and you're no longer able to control it? This same question applies to _every single_ IoT thing imaginable, just replace "LED lightbulb" with kettle/stove/fridge/water heater/radiator/air conditioner/whatever. IoT has no real open standards, and thus lacks any semblance of future proofing.
Not to mention the bajillion security issues with items like these.
The buzz about IoT is because it sounds futuristic (it really does!), and it has some potential - given time, open standards, security and future proofing. As of now, it's a market flooded with proprietary, gimmicky junk.
What do you imagine IoT is? A smartwatch is the HIGH end of IoT. IoT requires REALLY low power CPUs. And you imagine that tablet-specced Intel chips are going to play a role? Get real. Call me when Intel gets a design win for a smartwatch, or a smartscale, or a fitness tracker.
No, Intel is already claiming to be doing really well financially in IoT and that it will be a major cornerstone of their business in the long term. I'm saying I don't know where Intel is doing anything noteworthy here. I realize a Core i series or Atom is way above the power available in watches and most microcontroller applications. But if Intel is making out really well in IoT, what specific products or applications are driving this?
"But if Intel is making out really well in IoT, what specific products or applications are driving this?" I think lies are driving it. More specifically, I think Intel is trying to deliberately mislead investors by booking any revenue it can as IoT as long as there is some vague connection. Xeons sold for IoT cloud? Counts as IoT. Modems sold to wireless security systems? Counts as IoT. Etc., etc.
This may be legal, but it is not informative from the point of view of understanding future technology. That was the point of the design wins I said I wanted to hear about.
(And let's remember that Intel is happy to flat out lie as long as the lie is vague enough that there's no real there there. How long did we hear that Skylake was on track before it was clear that it was actually delayed by about a year? They never actually admitted that; simply claimed that yields were great and it was just that they chose to deliver this tiny dribble of chips for the first six months. Same thing with 10nm; everything was on track until one day it wasn't and we all heard about Kaby Lake...)
OK, so you're just a raging anti-Intel fanboy of some sort. Lies? Deliberately misleading investors? Those are serious accusations. Road maps are presented as projections only, and for a reason: No company in its right mind would ever guarantee future performance. Technology setbacks happen, and as a result timetables get pushed back or volumes are constrained for awhile. This isn't new or unique to Intel. AMD/ATI and Nvidia have this problem too. Memory manufacturers announce new SSDs and then are months late supplying the market.
And with Intel, it's particularly understandable just HOW their assumptions worked. Their "tick-tock" release schedule worked for multiple generations of product, with parallel 2-year development cycles (one focused on core design improvements, one focused on process shrinking the most recent core design). It was obvious to anyone they were assuming that they could achieve a process shrink every two years. Anyone familiar with the technology knows that there are new and greater difficulties with each process shrink due to the laws of physics. I'm not sure how it's a lie or deceptive to say "our goal is to achieve a process shrink every two years" when the risks in that statement are all public knowledge.
Oh, and it was known that 10nm would be delayed long before Kaby Lake was announced. Way back in February 2015, a year before announcing Kaby Lake, Intel had already pushed its 10nm product launch back to late 2016/early 2017. The Kaby Lake announcement pushed back the 10nm chips a bit further, to late 2017, and introduced a gap filler product to fill the extended product gap. There was no single, sudden "oh we've been way off all this time and didn't tell you" moment like you're claiming.
Portraying this as some conspiracy to defraud investors is just crazy.
Smartwatches and fitness trackers are compact wearables. These are a very specific category of IoT devices, where size and power are at an absolute premium. And Intel already has a solid product on the market with an embedded Atom CPU. Basis Peak is a connected watch/fitness tracker that has gotten pretty good reviews. It's not perfect, but neither is any other fitness tracker out there. The point is, it's obviously something that works today.
But IoT isn't just about smartwatches and fitness trackers at all. Like I already mentioned, Intel announced an embedded solution for automotive applications. Connected vehicles are the next big thing, and automotive designers can afford to use a 2W tablet-class SoC instead of a 0.5W smartphone SoC. In 2015 there were 17M smartwatches sold, but more than half were Apple Watches with in-house ARM CPUs, and most of the rest were lower-cost, lower-margin parts. There's not much room for Intel to grow into there at the moment. But there were 16.5M vehicles sold in 2015, and that's a wide-open market. Integrated connectivity systems are currently premium upgrades or luxury standard features, making an Atom SoC with integrated LTE an ideal platform with potential for profitability. If Intel rolls out a version with integrated Intel HD graphics, then it has an all-in-one SoC solution that can fully power the built-in infotainment system as well.
This is the market Intel is looking at now. Automotive, industrial, medical, and commercial applications where the SoC is a relatively small part of the system price, making it easier to absorb the premium price Intel wants for a high-performance IoT product. Intel is going to focus on those things for now--but when those things can also downscale to tablets and maybe even phablets, you'll see Intel staying in that market too, if only at the high end.
I desperately want a Windows phone with Continuum that can run all my old crappy Windows apps (like RSAT). That means a full copy of Windows, which needs x86 instructions. Intel could do this and blow the business phone market wide open, or they could not bother, go away, and sulk because no one wants to play with their ball any more.
Most business users don't even know what x86 is. The major desktop programs have all been brought to phones and tablets in at least some form. Tech nerds like us like the idea, but we aren't about to drive the market.
I'm sorry to read this. A phone that can double as a PC is the only somewhat exciting future development I can see in PC space. Now the only phones that could be converted to "PC's" would be ARM based, and that's not nearly as exciting.
I really hope Intel doesn't stop developing Atom for tablets and low-cost computers. I've been excited for a possible Surface 4 for a while now. Offering a base model Core m for a reasonable price is definitely an option as well, but seems close to impossible. Developing a complete solution for smartphones, shipping on time, and winning designs is one thing - it's sad to see Intel exit that market, but it's understandable - so why can't they keep developing an efficient low-cost solution for PCs? Is it so impossible for Intel to have a low-margin side of their business? One that doesn't compete directly with their core business, no less. No one in the market for a budget PC tablet would even vaguely consider buying a premium Core m product instead. Not having an efficient low-cost x86 SoC would put a stop to a lot of the innovation that's been happening recently. I hope Intel can continue developing Atom.
2007 may have been a good time to spin off the ARM business (XScale, which went to Marvell), but keep a controlling share of an ARM-based mobile vendor. Let it sink or swim on its own merits. Intel culture would never allow a much lower-margin architecture/product to undermine the x86 business.
The problem now is not whether ARM ends up in conventional PCs (or server applications), but whether phones/tablets satisfy most people's computing needs, causing a decline in PC-class device sales.
People keep debating when Apple will release an ARM-based OS X device; I don't think that is the plan. Near term, the iPad Pro is Apple's "releasing an ARM-based OS X device" (yes, I realize it is iOS, and not the answer for a lot of people, but it is arguably a "laptop" answer for a growing number of people). Surface is more of an answer now for many people, but less competitive looking forward to the generation(s) growing up in a touch/app-based world, due to a lack of apps designed for touch and no strong phone base. Having corresponding apps on your tablet and phone is more compelling than corresponding PC apps on your "tablet" without the phone component.
I think the headline should be "Intel cancels Broxton, its smartphone and tablet SoC". What remains of Atom is targeted at "Cloudbooks", i.e. cost-optimised netbook-like devices at or above 11.6 inches in screen size.
No, that's not accurate either. They did not say they're abandoning the mobile space. You're making the same mistake everyone else is, which is assuming that Intel has no alternative plans.
Let's be realistic. ARM is mainly associated with toy devices. Intel and Windows offering the same mobile stuff is pointless as 2 is already a crowd with iOS and Android. Let ARM have the tiny margin market.
boozed - Friday, April 29, 2016 - link
What's with the weird capitalisation?
Eden-K121D - Saturday, April 30, 2016 - link
Acronyms my bad habit
Iketh - Saturday, April 30, 2016 - link
What's with the before and after sentence?
LiverpoolFC5903 - Sunday, May 1, 2016 - link
The Z3580 was plenty competitive against its contemporaries. It is still faster than most mid-to-upper-range SoCs.
Bizarre decision, tbh. I was looking forward to a new Intel-powered ZenFone 3.
TheinsanegamerN - Monday, May 2, 2016 - link
What does x86 have to do with Google's update model?
Alexvrb - Monday, May 2, 2016 - link
Or the update model of OEMs with semi-custom Android.
Wolfpup - Wednesday, November 15, 2017 - link
Huh, that's interesting. Didn't realize that. So x86 still can make a difference even in this space.
StevoLincolnite - Saturday, April 30, 2016 - link
The problem there, though, is that it gives competitors a road to encroach on Intel's more profitable businesses. There were rumblings of ARM entering the datacenter at some point.
FunBunny2 - Saturday, April 30, 2016 - link
-- It is THESE things, ... that represents the true x86 complexity.
So, what's your point? Is it:
1 - a gaggle of ARM cpu can do what the X86 does vis-a-vis datacenter since it doesn't have all that cruft, so ARM will win
2 - datacenter is mostly embarrassingly parallel, so ARM will win
3 - all the extra cruft stuffed into iWhatever silicon is needed for datacenter, so Intel will win
extide - Sunday, May 1, 2016 - link
No dude, the x86 complexity is ENTIRELY in the front end, in the decoder. (SMM, PAE, interrupt model, hypervisor stuff, etc) are all things you need to consider in ANY arch, not just x86, or even ARM.
sparrow2 - Saturday, April 30, 2016 - link
Yes, Amazon is selling ARM chips to datacenter users.
75nationalchamps - Saturday, April 30, 2016 - link
Why not give the 15% the option to take a pay cut til business picks up? Otherwise you may still be supporting them in the form of food stamps, Medicaid, and welfare.
Argosy - Monday, May 2, 2016 - link
One issue is that take-home pay is maybe half (?) the cost of an employee (consider healthcare, 401K/pension, basic overhead, etc.) to a company like Intel. A 20% cut in take-home pay is not a 20% cut in the cost to Intel of keeping someone employed. Some of these employees will get "voluntary severance"/early retirement.
If you could convince everyone in the company to take a 15% pay cut, that may be an option. But then you start to lose your high performers to other companies, etc.
This stuff is not so cut and dry unfortunately.
FullmetalTitan - Sunday, May 8, 2016 - link
As someone who works in the mobile side of the SoC market in manufacturing, I can tell you that the margins are slim. Unless you have a noticeable market share advantage, profits can be pretty sensitive to minor market disturbances. Intel's business model requires high profit margins due to their research-heavy focus. Why spend $10B on low-power SoC R&D to eke out $1B profit over investment costs (assuming you can compete against current ARM producers) when the same $10B into data center, 5G, etc. can net them a guaranteed $5-6B in profit over investment?
ianmills - Friday, April 29, 2016 - link
Ask yourself: can you get hundreds of dollars selling a mobile chip? The mobile chip market is small potatoes compared to Intel's other business.
Ranari - Friday, April 29, 2016 - link
Or maybe because Intel is competing with an x86 chip in an extremely dominant ARM environment?
zeo - Saturday, April 30, 2016 - link
Bingo. ARM is basically a very scalable and customizable architecture. Anyone can license it and customize it any way they want, which allows it to be extremely adaptable and very low cost...
While Intel's x86 is one-size-fits-all... making it not very customizable at all, and while not expensive, still not as cheap as ARM can be... Thus the need to subsidize in order to compete, and that's not even counting how hard it is to gain market share in a market already dominated by ARM...
Besides, why would they? It's not like there was an urgent need to run software that requires an x86 processor, and initially popular mobile OSes like Android had issues with app compatibility on x86 devices, which left a stigma even though it's 99% a non-issue anymore.
Samus - Saturday, April 30, 2016 - link
There are Android builds for x86. Any dual-boot Windows/Android tablet runs Android-x86. Even without optimizations it has superior SunSpider performance to most of the ARM competition.
Long story short: Intel already has a competitive mobile CPU for everything BUT phones. Braswell is totally competitive in 8"+ tablets where you can get a 5000+ mAh battery. They just ran the numbers and found that, at the price they've been selling these chips at, scaling their power down would take more R&D and manufacturing than the additional margins of selling a smartphone variant would cover.
And when they saw those numbers, they pulled the plug. Don't blame them. The smartphone market is saturated as hell and it's becoming less profitable for everybody (even Apple) and Intel will never get Apple onboard with an x86 CPU. They'll be lucky to get many Android OEM's, especially Samsung or those in bed with Qualcomm. Even if they got the WinMo market, it wouldn't matter because less than 4 million Windows Phones were sold last year. Put into perspective, Apple sold more WATCHES than Microsoft sold PHONES.
Intel is right to bail on this market. It isn't going to make them money. Not "Intel" levels of money.
StevoLincolnite - Saturday, April 30, 2016 - link
Plus binary translation.
There have been some design wins with Atom in Android, such as the Samsung Galaxy Tab 3 10.1 - and for the most part you wouldn't have known it was an Intel chip.
name99 - Saturday, April 30, 2016 - link
The ecosystem with volume is the ecosystem that wins in the end. Intel has relegated itself to the position of IBM --- they can continue to sell big expensive systems, but they'll become increasingly marginal to mainstream computing.
Like I said above, smartphone SoC revenues (when properly accounted for) likely already exceed Intel revenues. Meanwhile smartwatches are about to take off, and we'll be getting more and more small CPUs in everything --- scales, dashcams, headphones and speakers, VR/AR headsets, cars, etc etc. In time this market will probably expand to the size of the smartphone market --- each item is cheap, sure, but you'll own 20 or 30 items each with a small CPU and a small radio in them. And Intel will have no relevance whatsoever to this market...
JimmiG - Sunday, May 1, 2016 - link
"each item is cheap, sure, but you'll own 20 or 30 items each with a small CPU and a small radio in them. And Intel will have no relevance whatsoever to this market..."
Except all the servers in all the datacenters that power all those devices run Intel CPUs. Without the cloud, those "smart" devices aren't very smart. The price and profit margins on those server/datacenter CPUs are a lot higher than on the $6 SoC in your smartwatch.
Michael Bay - Sunday, May 1, 2016 - link
He's just an insane fanatic. The line about smartwatches being "about to take off" alone shows he belongs in an institution.
xdrol - Saturday, April 30, 2016 - link
You can sell a mobile chip with every toaster, while you can only sell so many chips for desktops, servers and laptops. Intel is actually a small player counting all chips sold.
Samus - Saturday, April 30, 2016 - link
Yes, but Intel's margins are ridiculously higher than anybody else's. Only Samsung could perhaps have the same margins, and that's because they use their chips through vertical integration, so there are no marketing costs, etc.
It's safe to say that for every $200 Core i5 sold, Qualcomm must sell TWENTY $34 808 SoCs to make the same profit, because there is so much ARM competition that they probably make $5 per chip, while Intel makes $120 (60%).
As smartphone sales start to slow down, this is especially a market Intel doesn't care to be in. Likewise, data centers are being built at record pace, and 5G towers are going to be going up everywhere in the next 2-3 years, so of course Intel is now focusing solely on Xeon and modems.
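Taking the figures above at face value (they are the commenter's rough guesses, not real financials), the break-even arithmetic is easy to check; with these numbers it actually works out to roughly 24 SoCs per Core i5 rather than 20:

```python
# Rough profit-parity check using the assumed figures above (illustrative only, not real financials).
intel_price, intel_margin = 200.0, 0.60   # assumed $200 Core i5 at ~60% margin
soc_profit = 5.0                          # assumed ~$5 profit per $34 Snapdragon 808

intel_profit_per_chip = intel_price * intel_margin     # $120 per Core i5
socs_needed = intel_profit_per_chip / soc_profit       # ~24 SoCs to match one Core i5

print(f"Intel profit per Core i5: ${intel_profit_per_chip:.0f}")
print(f"Snapdragons needed to match it: {socs_needed:.0f}")
```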
ryccoh - Saturday, April 30, 2016 - link
I wonder if they won't lose in datacenters as well, though. What if deep learning ends up in most data centers and it mostly runs on Nvidia hardware?
Ratman6161 - Saturday, April 30, 2016 - link
"Intel is actually a small player counting all chips sold". Perhaps... but a company's stock doesn't ride on raw numbers sold. It is based on profit. And a small number of relatively expensive Xeon server CPUs is orders of magnitude more profitable than huge numbers of $3 phone CPUs.
Some may wonder why Intel was so late to the game. I take the other approach: why did they bother? They have markets where they are number 1 and likely to stay that way for a long, long time, i.e. laptops and desktops (admittedly a flat market now, but phones and tablets are headed to the same thing) and servers (a big growth area). Getting into phones and tablets was always a distraction from their core business, in my book, that they shouldn't have bothered with in the first place.
That said, I've got a Z3580-based tablet where I can't tell the difference in performance vs my Galaxy Note 5 phone (yes, I've seen the benchmarks, but they don't matter for my use cases), and if nobody told me it was an Atom CPU in there and I just tried it out? I would never have known... nor would I have cared. So they reached parity, but just did it too late, and in the end this isn't the business they should be in in the first place.
simard57 - Monday, May 2, 2016 - link
What is the battery performance like on the Z3580? Does it match up with ARM?
I think the reason Intel is late is that their focus had been on MIPS while ARM's was on efficiency... they likely looked ahead at a market that was cresting and decided it wasn't in their best interest.
I do wonder about their IoT angle though. Isn't that the same performance/efficiency curve that smartphones are on?
name99 - Saturday, April 30, 2016 - link
TSMC's revenue is about half Intel's revenue. That's just the TSMC part, there's also (in that revenue stream) the Rockchip, Mediatek, QC, and nV parts. Then there's also the Samsung side of the equation. And the imputed revenue on Apple's side.I'd bet the total is already larger than Intel's total revenue...
mdriftmeyer - Saturday, April 30, 2016 - link
Apple's revenues are pushing well past $220 billion annually.
@name99: "TSMC's revenue is about half Intel's revenue. That's just the TSMC part, there's also (in that revenue stream) the Rockchip, Mediatek, QC, and nV parts."
TSMC is a fabrication house. Intel doesn't exactly make much money on fabrication. The revenue TSMC makes comes from Rockchip, Mediatek, QC, nV, AMD, etc., so it's more of a negative for ARM profit. Also, TSMC does more than a little bit of business that has nothing to do with ARM chips.
TomWomack - Saturday, April 30, 2016 - link
There are an awful lot of sandwiches sold today, and nonetheless Intel is not investing or prioritising in the sandwich market. The year to be selling smartphones was 2014; it's now basically a replace-as-they-break market, much more cost-driven than it was.
retrospooty - Saturday, April 30, 2016 - link
Like I said, they were making billions in profit every quarter on desktop/laptop/server chips... Also, the mobile market was crowded. And yes, it was probably a mistake, but that is how it may have seemed to make sense. Same thing for MS: from the early 2000's until 2012, how did it make sense for Microsoft to put in a "half assed" effort on a smartphone OS? Bad choice in hindsight. By the time Windows Phone was actually pretty good, no-one cared to switch.
Ratman6161 - Saturday, April 30, 2016 - link
"By the time Windows Phone was actually pretty good, no-one cared to switch." Great analogy.
zeo - Saturday, April 30, 2016 - link
Actually, they did put priority on it... The ATOM went from a 5 year product cycle to a two year cycle like the Core series... The ATOM also rapidly went on the newest FAB; there were just delays with the 14nm FAB in general that put Intel behind schedule, but they still got Cherry Trail out within a year of Broadwell, and Cherry Trail even used an Intel Gen 8-based GPU like Broadwell's...
The problem is ARM dominates the mobile market, is a cheaper technology for OEMs to invest in, OEMs can customize ARM in ways they can't Intel SoCs, and it was basically a hard sell for Intel without any clear advantage, which they couldn't offer without going to their much more pricey Core series technology...
Sure, profit margins played a role and they also didn't want to give up their profit margins for their higher end products but the ATOM hasn't been a gimped or low priority product for over three years now... It just hasn't been successful enough in the mobile market versus ARM, which is especially doing well in GPU performance, which has started to rival even some of Intel's higher end iGPUs...
Ratman6161 - Saturday, April 30, 2016 - link
"The ATOM went from a 5 year product cycle to a two year cycle like the Core series" But... Apple, Qualcomm and Samsung come out with a new generation every year. Of course I expect this to grind its way down as phones and tablets reach "good enough" and people start losing the itch for constant "upgrades".
zeo - Sunday, May 22, 2016 - link
No, Intel's 2-year product cycle refers to an actual cycle between their FAB advances... Meaning everything that happens from one FAB advance to the next, which includes yearly updates...
For Intel this usually falls into what is known as the Tick-Tock cycle... The Tick being the FAB advance, which offers the usually smaller product improvement, followed by the Tock, which offers the bigger architecture advancement that improves the product on a given FAB...
Intel previously held a FAB lead because they rapidly advanced to the next FAB... Only recently has ARM managed to close this gap... It helped that Intel has had issues with their FABs and thus had to temporarily extend their cycles to 2.5 years, and we may not see the 10nm FAB from them for close to another two years.
Besides, aside from the switch to 64-bit, ARM hasn't seen a really big update in a while...
So no, it wasn't an issue of keeping up with the competition's SoC updates... Though you're not entirely off, because Intel wasn't updating what went into a typical system as rapidly... Things like 4K cameras, faster storage, etc. are what Intel was really falling behind on.
The advantage of ARM is that it is extremely easy to customize and scale... while Intel's products were whole-package or nothing... The new Goldmont update would have finally changed this, but it was basically too little, too late on that score, and products are about more than just how good the processor is anymore... The entire product ecosystem is what matters in mobile...
Even companies like Nvidia had to give up phones because of this, though that doesn't mean they can't find other niches they better fill... Like Nvidia is doing very well in cars right now...
It remains to be seen if Intel can make a go at IoT, either in the actual products or in the infrastructure that supports those products, such as their new emphasis on 5G...
ViRGE - Friday, April 29, 2016 - link
The short answer: margins.
The long answer: Intel could absolutely build a killer smartphone SoC on the 14nm process. However it would need to be Core based; the Atom CPU line isn't performant enough. They may need to go back to Ivy Bridge or strip down Skylake a bit to get there, but they could do it.
The issue is that smartphone SoCs have very low margins. MediaTek, Rockchip, etc are crushing the market. Atom is designed for margins, not performance. If Intel built a winning Core smartphone SoC, then they'd have to sell it for significantly lower margins than what their other Core parts sell for. And that in turn would put pressure on the rest of their chip stack; why should an OEM pay $250 for Core-M when you could get a similar smartphone SoC for $30?
For what it's worth Apple faces a similar challenge. But being vertically integrated means they don't have to share with others, and the total profit off of iPhones/iPads can offset the higher costs involved in developing and fabbing the A-series SoCs. However Intel doesn't have that luxury, and Asus would never let Intel have all of the profit.
Smartphone SoCs are a race to the bottom. And Intel as a business has nothing to gain from taking part in that race. The winner is the guy who got badly hurt but didn't die; a Pyrrhic victory.
madwolfa - Friday, April 29, 2016 - link
^ This guy is correct.
Klinky1984 - Friday, April 29, 2016 - link
I agree with this. Intel has been averse to getting into markets that become commoditized. How many smartphone parts do they have to sell to get the same margin they get from a single sale of a server-class CPU that costs thousands?
Intel999 - Saturday, April 30, 2016 - link
You are correct, Virge.
However, the margins in smartphones were crap two years ago as well. Why did Intel keep throwing more money at it as long as they did, if margins are and were their ultimate goal?
Probably due to the fact that margins are dropping in their PC market, and possibly in DCG as well, based on two quarters of lower ASPs. These "core" segments can no longer provide Intel with sufficient margins to offset this ill-advised attempt to be a player in the low end.
They better hope that their FPGA business picks up to offset the major drop in volume in their fabs from a crashing PC market and now non-existent phone processors. With all the fixed costs associated with fabs it is also a margin killer to run fabs at low utilization rates.
I wouldn't be surprised if Intel has already started cutting shifts at some of their fabs starting this quarter. They have a lot of inventory to burn off as it seems they chose to keep the fabs running through Q4 and Q1 holding out hope for a rebound in the PC market that hasn't seemed to arrive yet.
As far as FPGAs are concerned, they missed their guidance in Q1 by over 10% on their new Altera business. It's early, but if they are that off on the guidance how far off were they on the proper amount to spend on Altera in the first place.
ABR - Saturday, April 30, 2016 - link
Some companies saw the writing on the wall earlier, some later. The Apple report makes it official: smartphones have peaked and are entering the commodity phase. Nokia got out of phones when they could, and they aren't coming back. Microsoft realized a couple of steps later. TI and Nvidia, longer ago. Now it's going to be about IoT and cloud, with a few side areas like cars. The big players will acquire, partially adapt, and survive, but the real winners will be those few companies that really get it first, and can execute to perfection.
Samus - Saturday, April 30, 2016 - link
Personally, I think Atom as it is, is completely competitive with most ARM configurations, especially the mainstream Qualcomm 808 and Exynos 5000. It rivals them in most non-GPU tasks, but obviously only in tablets. In smartphones, where it doesn't even exist, it'd be too hot and too power hungry. The R&D to fix this outweighs the possible returns of selling low-margin chips. But the R&D would only be in further shrinking Atom down, not scaling Core to be an SoC. Making Core, even the X3 variant, less than 4 watts with a full onboard modem would be almost impossible without some huge compromises and billions of dollars.
TheinsanegamerN - Tuesday, May 3, 2016 - link
But if it only rivals them CPU-wise, loses GPU-wise, and runs hotter/pulls more power, how is it competitive?
When it comes to something like a tablet, battery life is very important, and that is something the ARM market has in the bag. Tablet-class Atoms just use too much juice. Why Intel didn't make an ARM-based Atom is beyond me.
zeo - Sunday, May 22, 2016 - link
No, battery life wasn't an issue and neither was heat... It was just margins, GPU, and the device hardware ecosystem.
But changing to ARM wouldn't have changed enough to matter for Intel... Their expertise is in their own technology... Just look at Nvidia: they tried to get into the phone market too and failed. Nvidia just managed to find niches that they do well in, niches that are not the same as everyone else's but that play to their expertise... When companies forget this is when they fail...
Argosy - Monday, May 2, 2016 - link
By Nokia "getting out" you mean running out of money unless purchased by MS. Much like Motorola and Sony...
zodiacfml - Saturday, April 30, 2016 - link
It is more complex than that. You are correct, but in a different way. The mid-range to high-end is still lucrative enough for Intel to care about, and Atom's performance and size aren't for the low end anyway. They could have gone with a big Core chip, but that would sit at a performance level above Apple's chips, no Android device can sell in that price niche, and it would be overkill for an Android operating system.
Intel cannot R&D the Atom fast enough and cheaply enough to compete with ARM without resorting to brute-forcing it with their manufacturing might.
It is probably priorities as they can't serve two masters at the same time.
See, when they release the latest process node, server/enterprise products get the bulk or all of it for the first few months to a year, as those customers will pay any amount of money just to be on the cutting edge. This is hard to refuse since their long and loyal customers are paying anyway, leaving no capacity for Atoms for a while.
Intel wouldn't mind losing profit here, as it is more important to get themselves into the smartphone/device ecosystem and have a share of it. This is the reason why they had to contra-revenue Atoms and hide this from investors, negating the higher margins of the PC/server products.
OEMs don't want the Atoms despite being paid to use them, due to inferior performance and other smaller reasons versus an equivalent ARM part. It is a market they couldn't get into, and other chip manufacturers are closing in, which I believe led Intel to leave the device SoC market for now.
MS also has a part here, as they should have had a Windows Mobile OS based on x86-64 desktop Windows: it would be faster and easier to deliver and, more importantly, would bring app availability. That, I believe, would have a better chance in the smartphone/tablet market and would require the use of Intel's chips.
Samus - Saturday, April 30, 2016 - link
I agree. Microsoft fucked Intel with Windows RT, not the other way around. Intel MADE the Atom for Microsoft, but Microsoft didn't respond with any interest. The closest they came to a lightweight version of Windows was 7 Starter Edition, which was just a neutered shell, not an optimized kernel.
Then Microsoft bought Nokia, a huge F U to Intel, because of Nokia's decades-long ties to Qualcomm.
I think after Intel waited around for Microsoft to introduce what we all thought would be an x86 build of Windows 10 for mobile, not an ARM port like Windows Mobile has always been, they got sick of waiting, saw the sales numbers of the WP platform, and figured Apple and Samsung weren't going to ditch ARM since they already have their own in-house development. And really, Intel wants no part of the Android phone market, because there is SO much competition they would literally be racing to the bottom.
zodiacfml - Saturday, April 30, 2016 - link
Right, Windows RT was not a good sign for Intel. Windows RT was Microsoft's backup plan in case ARM got a lot better and Intel didn't step up their mobile chip offerings. Unfortunately for MS, ARM did not become significantly better and Intel did produce some decent SoCs, albeit quite late to market.
Many articles mention Windows RT as a flop, but given that Windows 10 Mobile is only available on the ARM architecture, to me it seems it is Windows RT with a different skin and additional features. Someone please correct me if I'm wrong here.
This gave me the idea that another reason Intel threw in the towel is Microsoft not going with x86 for Windows Mobile anymore. MS probably thinks that having those devices be full-fledged Windows PCs would hurt the PC industry even more. It would, but only in the short term; having users sucked into the Windows world is better than them preferring Android/Google or Apple.
lilmoe - Saturday, April 30, 2016 - link
I always thought Atom was about keeping AMD out of business in that area.... Not margins. At least not directly.
sonicmerlin - Saturday, April 30, 2016 - link
Imagine if x86 wasn't a controlled duopoly. How cheap would our CPU chips be in that case?
LiverpoolFC5903 - Tuesday, May 10, 2016 - link
good post, completely agree.
extide - Sunday, May 1, 2016 - link
They COULD but "Intel Internal Politics" would not ALLOW them to.
jasonelmore - Friday, April 29, 2016 - link
Late to the game, 300% margins, no integrated LTE... take your pick.
jasonelmore - Friday, April 29, 2016 - link
Intel should take that Gen5 LTE and bake it into every single mobile chip it makes for laptops.
TheinsanegamerN - Tuesday, May 3, 2016 - link
There are some laptops with integrated modems available. Dell's 5000 and 7000 series Latitudes have them.
The issue with integrating them is that most people won't use them. That is expensive silicon real estate that isn't being used.
iwod - Saturday, April 30, 2016 - link
Do we now use margin and markup interchangeably? Because the 300% margin you mean implies a $1 cost of goods selling for $4. Before, margin was used for dividing, where a 50% margin implies dividing by 0.5, ending up with a markup of 100%.
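For anyone fuzzy on the distinction being drawn here, a small sketch of the two formulas, using the $1-cost/$4-price example from the comment:

```python
# Margin vs. markup on the same numbers (illustrative only).
def markup(cost, price):
    return (price - cost) / cost      # profit relative to cost

def margin(cost, price):
    return (price - cost) / price     # profit relative to selling price

cost, price = 1.0, 4.0
print(f"markup: {markup(cost, price):.0%}")   # 300% markup...
print(f"margin: {margin(cost, price):.0%}")   # ...but only a 75% margin

# A 50% margin means price = cost / (1 - 0.5), which is a 100% markup:
print(f"50% margin as markup: {markup(1.0, 1.0 / 0.5):.0%}")
```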
MonkeyPaw - Friday, April 29, 2016 - link
Both Intel and ARM SOC builders had the same target, but ARM was going at it from below (improving performance with each generation), while Intel was coming from above (trying to drive down power). I think it was a matter of time before both camps were on a level playing field in terms of performance, but by the time we got there, ARM-based chips had dominated the market. Intel has no advantage, and honestly I bet most OEMs are happy to not have to deal with Intel for chips for a change. I can't say that I'm sad, as Intel dropped the ball when they didn't see this coming.
misaki - Saturday, April 30, 2016 - link
I don't think that's necessarily true.
Qualcomm is basically the Intel counterpart for ARM outside of China and as we saw last year with Snapdragon 810, they completely dropped the ball for a generation of chips and being forced to use the generic ARM design left them with an underperforming and hot chip for the higher end of the market. None of those OEMs were happy with what they got but there was no viable alternative for the market.
Flunk - Friday, April 29, 2016 - link
They didn't support the ARM instruction set that everyone else was using, and they purposefully gimped chips to artificially separate models (like they do with PC CPUs); no one else does that.
mdriftmeyer - Saturday, April 30, 2016 - link
^ Correct, and they sold off all their own ARM investments and the chips designed on them before Apple came out with the iPhone. They shot themselves into a world of hurt. Apple will soon have no desire to be tethered to Intel in any way, shape, or form.
Intel will be a supply vendor like the rest of them, at Apple's discretion.
beginner99 - Saturday, April 30, 2016 - link
Margins. You can't get >50% margins in that space. You can see that Intel crippled Atom on both the CPU and GPU side, but especially the latter. More GPU means a bigger die, which means less money for Intel.
The second point I've heard is that their whole company structure, from design to fab, just isn't suited to the fast iterations required in the smartphone market.
Krysto - Saturday, April 30, 2016 - link
Company incentives (read: profits) and an unwillingness to switch to ARM for mobile. Despite what most others say, ARM is still more efficient in mobile, but that's when compared on a similar process node.
Ever since Intel entered the server market their business model has been based on their ability to sell expensive server chips that had their research and development funded by a massive desktop population.
They've used every trick in the bag to ensure those user groups stayed as separated as possible, so server users couldn't actually use desktop CPUs instead and destroy their margin.
That doesn't work in the other direction, but most importantly, keeping desktop users away from the mobile chips at ARM prices is becoming increasingly difficult: What's the benefit of selling millions of additional mobile SoC if each means a desktop CPU less?
After running RemixOS (Android x86-64) on a €100 N3700 motherboard (no moving parts) I have to conclude that anything more expensive requires special needs (workstation/heavy duty gaming) to justify as a PC, laptop or HDMI stick.
Even Windows is ok, unless your other PC happens to be a 4GHz i7 (where the change of pace becomes noticeable).
Intel is becoming a niche player. They have finally realized it and we can be sure that they'll continue to fight hard and dirty to make sure that niche stays as large and as profitable as they can make it.
name99 - Saturday, April 30, 2016 - link
There is one camp that says this is all economics: Intel was afraid to create good chips in this space because those would compete with their mid-range.
I don't buy this. If Intel had thought that way, there were MANY different ways to avoid the problem. (For example, they could have shipped phone CPUs that were x86-64 only --- able to work with modern compilers and dev tools, but unable to run any older Windows software.)
I think the issue is the same one that x86 fans constantly refuse to admit. The complexity of the x86 ISA (broadly considered: not just the instructions but also the accumulated supervisor details like PAE and SMM, the complicated hardware expectations, and the memory model) is what did them in.
Creating one of their CPUs requires twice as long and vastly more people than on the ARM side. At first they wouldn't believe this and didn't budget enough people and enough time to maintain pace with ARM. By the time they did realize this, it was basically game over. They just could not move as fast as ARM, and once they lost process advantage the future was obvious.
If they could churn out competitive CPUs at the pace (and cost) of Apple or QC, they'd be fine. But complexity costs. They refused to accept that ten years ago, and today we see the consequences.
(And so much for their "We will power the IoT" crap. Yeah, you'll power it with what? Your stupid Quark chip that no-one's interested in and your non-existent 5G modem? Quark shows exactly the same stupidity as Atom, and will fail for exactly the same reasons. By the time Intel have finally shipped a version that matches what smartwatch vendors wanted in 2015, the ARM eco-system will have shipped three impressive updates to their various SoCs.)
ARM is, IMHO, in a fundamentally better place than Intel because they're willing to accept some short term pain in order to tame complexity. They've changed ISA a few times and they've been willing to abandon mistakes or obsolete ideas (like Jazelle). Right now they have to carry the burden of both the v7 (including Thumb) and v8 ISAs (and some hassle related to memory models and the older interrupt model), but we already have the server vendors saying they've dropped everything pre-v8, and I expect Apple will do so in the next year or three, followed by ARM itself. This is a fundamental in engineering --- complexity WILL kill you, unless you CONSTANTLY work to weed it out.
Reflex - Saturday, April 30, 2016 - link
You keep making these assertions no matter how many people correct your misconceptions. I hope it makes you feel better, since obviously you need it.
sparrow2 - Saturday, April 30, 2016 - link
Intel hasn't had success with a new design since the Pentium, and that dates to 1993 or so. They managed to recover from the Itanium fiasco thanks to AMD, but that caused them to focus back on x86, while the world was changing.
extide - Sunday, May 1, 2016 - link
Basically there are a few reasons. This is my opinion, at least. I think the main point at which they messed up is when they sold off their ARM IP to Marvell and decided to bring in a low-cost, low-power x86 arch in its place. This created the problem that this chip could now potentially compete with the 'main' x86 arch lineup of chips (currently Core). That means the Atom core has always been artificially limited in terms of performance.
See, I think that if they, instead, had kept their custom ARM IP and continued to develop it, they could have easily been a market leader. There are several reasons for this: 1) They would be selling an ARM chip into an ARM market - MUCH easier than selling an x86 chip into an ARM market. 2) Since it was an ARM chip, it wouldn't really compete with the main x86 arch, which means it would not have to be artificially limited in terms of performance or capability. And 3) I have little doubt that Intel could have designed proprietary ARM IP cores that could compete in terms of performance and efficiency with the likes of Qualcomm, Samsung, and even Apple.
JDG1980 - Sunday, May 1, 2016 - link
Because Intel got arrogant and lazy. They thought their process advantage was big enough that they could beat ARM with one hand tied behind their backs, artificially restricting performance to avoid eating into more lucrative Core sales. Turns out that ARM 1st tier products were more than a match for Intel 2nd tier products. And now that the foundries finally got FinFET right, Intel's process advantage is much less than it once was (probably about a half-node ahead at this point).
Had they taken this market seriously from the beginning, they might have won.
digitalgriffin - Monday, May 2, 2016 - link
Intel had legacy issues. They were based on an x86 CISC design, which requires a lot of transistors. This means there is a considerable power drain to maintain backwards compatibility with older x86 processors and code. So even though Intel chips used a smaller process node, the extra transistors ate up that advantage.
Add to this the fact that Google didn't actively develop drivers specifically for Intel chips. Intel invested a ton in programmers to create drivers for Android, and quite a few of their products were less than stellar in long-term stability. (Ask someone who has experience with a few.)
The added complexity, lower profit margin, and need to develop in-house drivers became a death knell for Atom in the long term. And Intel is loath to compete in the lower-profit RISC/ARM space.
I predict a long-term shrink for Intel of around ~50% of market cap over the next 15 years, due to a number of factors (people moving to tablets, lack of any real speed improvements, and Moore's law limitations). Server market and cloud is the correct strategy, but Intel has a slow walk into home computer obsolescence in front of them.
NEDM64 - Friday, May 6, 2016 - link
1 manufacturer vs many?
jjpcat@hotmail.com - Friday, April 29, 2016 - link
AMD has already cancelled their cat CPUs. Now if Intel cancels Atom, it will leave a giant void in the sub-$250 client computing device world (desktop, laptop, Chromebook, and even tablet).
stephenbrooks - Friday, April 29, 2016 - link
Well, the technology has been getting more expensive with each node, while users' upgrade cycles have stretched out because the current generation is "good enough". That means there might not be enough revenue to support the R&D any more (unless they suddenly get another 3 billion users from developing countries or something). So, either the unit prices go up or the R&D slows down (14nm for 3 years rather than 2?) Doubling the lifetime of PCs is great for us, great for the environment, but *halves* the revenue of places like Intel. I guess what I'm saying is this might be an inevitable stage after the initial growth phase.
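The "doubling lifetime halves revenue" point is just steady-state replacement math; a toy model with an illustrative, made-up installed-base figure:

```python
# Toy steady-state model: yearly unit demand = installed base / replacement cycle.
# The installed-base number is made up purely for illustration.
installed_base = 1_000_000_000          # hypothetical number of PCs in use

for cycle_years in (4, 8):              # upgrade every 4 years vs. every 8 years
    units_per_year = installed_base / cycle_years
    print(f"{cycle_years}-year cycle: ~{units_per_year / 1e6:.0f}M units/year")
```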
MonkeyPaw - Friday, April 29, 2016 - link
From what I read, they aren't cancelling Atom, just the phone version of Atom. The tablet and netbook class chips will still advance.
Vlad_Da_Great - Friday, April 29, 2016 - link
@jjpcat@hotmail.com: Celerons/Pentiums are there, as well as the Core M. Atom has not been updated since 2013, and I believe it will be eliminated completely.
TomWomack - Saturday, April 30, 2016 - link
That is, 'the brand Atom for consumer desktops' may well be eliminated completely. There will still be cheaper chips branded Celeron containing a two-wide out-of-order microarchitecture, next to slightly more expensive chips branded Celeron containing a four-wide out-of-order microarchitecture.
alexvoda - Friday, April 29, 2016 - link
What about Denverton, the successor to the Atom-based Avoton?
I hope they are not abandoning this too. The C2550 and C2750 chips have been great for a firewall/router/NAS, and it would have been really nice to see a successor.
hechacker1 - Friday, April 29, 2016 - link
I built a NAS/VM server out of a C2750, and it's pretty good, and supports ECC.But honestly, even with 8 cores, it's just not fast enough. If I were to do it again, I'd get the lowest end xeon or i3 that has two fast cores and ECC support.
It's just that Intel doesn't make it economical to get ECC and server-type features.
hechacker1 - Friday, April 29, 2016 - link
I should also add that, compared with a low-power Skylake i3, power consumption on Avoton isn't even that far away.
Vlad_Da_Great - Friday, April 29, 2016 - link
Xeon D will chew that market.
Alketi - Friday, April 29, 2016 - link
So, the dream of the PC in your phone that you clip into your docking station at work, then put in your pocket to take home, just died...
KaarlisK - Friday, April 29, 2016 - link
This. I was really hoping it would be my next phone.
Vlad_Da_Great - Friday, April 29, 2016 - link
Not dead my friend, PPD.
sorten - Friday, April 29, 2016 - link
I think what's dead is actually the idea of an x86 phone, and with it Win32 apps. The Windows Store serves 280M people and growing, and the Universal Windows Platform is gaining momentum. I doubt Microsoft was ever planning on an x86-powered phone.
osxandwindows - Saturday, April 30, 2016 - link
UWP is a joke, my friend. Simple as that.
Those mobile apps won't ever replace desktop apps.
zodiacfml - Saturday, April 30, 2016 - link
lol. I agree. For the original poster, I think MS was greedy and planned to provide consumers with two separate computing devices, the classic Windows PC and Windows mobile, as was the trend back then when people had both a PC and a smartphone.
Now, people are happy without a PC because smartphones became so sophisticated that only serious content creators still need one. They should have started early on with a cut-down version of x86 Windows so that app availability would have been good from day one.
Murloc - Saturday, April 30, 2016 - link
"people are happy without a PC because smartphones became so sophisticated that only serious content creators don't use"PCs are going to be with us for a long time, simply because of the ergonomics and need to do get work done, and torrenting and light gaming (MOBA, other free games, RTS), what changes is that nowadays average teen-agers and children are okay with having just a 2 years old tablet and a smartphone and use those, they aren't going to be buying a big laptop or even a desktop like they traditionally did to cover their communication needs, time in front of a PC decreases massively if you eliminate these uses and one for the family is enough.
eddman - Saturday, April 30, 2016 - link
How do you know? Just because UWP is limited today compared to Win32, doesn't mean it'll stay that way forever.
sorten - Saturday, April 30, 2016 - link
@osxandwindows: Only time will truly tell, but your reference to UWP apps as mobile suggests that you don't understand the architecture and where Windows is headed. I'm a Windows developer, and if I can write an app and push it into the Windows Store, which provides distribution to 280M+ customers for a 25% hit, why would I choose anything else unless I'm just writing a mobile app that doesn't need a "store"?
I have written many Windows desktop/Win32 apps, built custom installers, and distributed that software via CD-ROM or downloads, and the UWP experience is much, much better.
sorten - Saturday, April 30, 2016 - link
edit: "web app", not "mobile app"lilmoe - Saturday, April 30, 2016 - link
"Those mobile apps won't ever replace desktop apps"Looks like someone didn't play around with the new Windows XAML and the UWP. This platform is more than capable of replacing fully fledged desktop applications for most users. This isn't Android/iOS (nor Windows Phone) where conventional mouse and keyboard support isn't front and center. Those same UWP applications with UI optimized for mouse and keyboard can scale perfectly for touch and smaller form factors, in the same binary...
Oh, you want a good example? Take a good look at Word and Excel Mobile.
osxandwindows - Saturday, April 30, 2016 - link
No fella, you didn't get the point.
osxandwindows - Saturday, April 30, 2016 - link
Sandbox, my friend. Why do you think people prefer the desktop version of Adobe CC instead of, you know, the universal version?
Hell, let's go a step further and make all Windows games universal.
Just look at the Dropbox app in the Windows Store.
osxandwindows - Saturday, April 30, 2016 - link
"Hell, let's go a step further and make all Windows games universal."
To elaborate on this, why not make Windows games universal?
Since we're already making "fully fledged desktop applications" lol, why not?
tuxRoller - Saturday, April 30, 2016 - link
And Windows Phone 7 is going to take over the smartphone market.
Sivar - Friday, April 29, 2016 - link
This doesn't sound like good news for a potential Surface Phone.
Pinn - Friday, April 29, 2016 - link
I worked on IA64, and many years after it bombed I asked the CEO directly at an employee forum how long they would stick with x86. It was clear they were not ready for big changes.
revanchrist - Friday, April 29, 2016 - link
All Asus Zenfone releases in 2016 will move away from Intel to Qualcomm (higher-end offerings) and MediaTek (lower-end offerings). Not sure if there are any new phones out there that will continue using the outdated Intel chips.
willis936 - Friday, April 29, 2016 - link
RIP x86 in your hand. RIP EE job opportunities. RIP architecture competition.
Murloc - Saturday, April 30, 2016 - link
ARM and its licensees don't employ EEs?
willis936 - Saturday, April 30, 2016 - link
When you flood the market with 10k already-trained engineers in an already mediocre-looking job field (look up the DoL projections), then I don't think it really matters who is doing the scooping up, since it probably won't be me getting scooped up.
iwod - Saturday, April 30, 2016 - link
What will happen to its modem business? That is the most important question.
nemo183 - Saturday, April 30, 2016 - link
iwod, I salute you but would suggest the question should be "What will happen to all businesses?" Please see my later post, chortle at my naivety, and when your sides stop hurting, do say what you think....
webdoctors - Saturday, April 30, 2016 - link
Huge Zenfone 2 fan. I've had it since it came out and it's a great phone, crazy at the sub-$200 price point.
warreo - Saturday, April 30, 2016 - link
"...Aicha Evans said that she wanted a big contract in 2016, otherwise we might not see her in 2017."I'm confused. Aicha Evans was fired/resigned a few weeks ago, so wouldn't that suggest they are giving up on discrete modems? Did they actually tell you (Ian/Ryan) that discrete modems remain a focus?
Ryan Smith - Saturday, April 30, 2016 - link
Aicha Evans is still at Intel.
warreo - Saturday, April 30, 2016 - link
While perhaps technically so, both WSJ and Bloomberg are reporting that she has already given notice of her plans to leave. I'd link the articles but don't know if that is against the rules, and they are very easy to find.
Given that, it surprised me to see her commentary being used to support Intel's continued discrete modem ambitions.
warreo - Saturday, April 30, 2016 - link
Edit for clarity: Given that, it surprised me to see her commentary being used to support the idea that Intel's discrete modem ambitions will continue.
warreo - Saturday, April 30, 2016 - link
Ah, I missed the follow-up -- Bloomberg reported last week that Brian Krzanich had convinced her to stay. I withdraw my previous comments!
LuxZg - Saturday, April 30, 2016 - link
This explains why ASUS split the latest ZenFones to have both ARM and Intel models. And it kind of explains why Microsoft did not make the new Lumias as x86 phones + Continuum.
But this news also means that Microsoft's smartphone numbers are now officially only going to keep declining until they hit zero. My last hope for a useful x86 device with a full desktop in a really small format - gone forever.
But I hope they don't kill Atom, because that would mean you can't get a tablet/2-in-1 with Windows in the $100-300 range anymore!!! All those CHEAP stick PCs would be gone as well. So many alternative formats...gone... So really - I hope they don't kill it. Really, I don't think they will either, as that lineup has finally started to pick up the pace, and the Wintel combo is finally pushing Android out of those markets.
zodiacfml - Saturday, April 30, 2016 - link
Don't worry. Only Atoms with modems, simply the smartphone SoCs.
zodiacfml - Saturday, April 30, 2016 - link
If they are thinking what I'm thinking, I think they are about to buy another company.
watzupken - Saturday, April 30, 2016 - link
I feel Intel has been complacent and joined the party way too late.
nemo183 - Saturday, April 30, 2016 - link
Look, there's much informed debate going on here, so I am reluctant to look stupid by asking a question that will seem barbarously naive at the very best. But since it's causing me a considerable amount of grief, I'll have to get over the social embarrassment, given this opportunity to become better informed. So, here goes!
Isn't this really the formal announcement of the death of 'Moore's Law of Computing'? Gordon Moore always said it would reach a natural conclusion, an end-point, and this is it. Accepting there are still performance gains to come, both from architecture tweaks and software efficiency improvements, hasn't semiconductor technology come to the end of the road?
Since 1965 it has probably been one of the most important drivers of the economy, and even today it is constantly referred to as the force that will provide the performance gains required to power the future for just about anything, from the Fourth Industrial Revolution up to national and global security.
There is much talk of the development of different forms of computing - quantum, DNA and so on - the list is endless, but seemingly no general consensus on what, when and by how much. There is certainly nothing in the pipeline that would allow for a seamless transition from one technology to another.
Yet few people mention this or appear concerned. Is this because it's just taken as a 'given', accepted by everyone and too depressing to talk about? I can think of many more options, but I won't list them, because I'd like to appeal to this community for their own informed views, for which I'd be very grateful.
There is a very serious reason for me asking this. Over the last few years I've been a very junior member of a research project (basically, I buy and lick the stamps and post the letters of the clever people) that has had access to some of the cleverest people alive, which has resulted in a view of the future that is alarmingly different from what most people expect. One thing is certain - on this topic, nobody agrees, and most are unwilling to state their position due to insufficient information.
Actually, that's not strictly true. There is complete consensus on one scenario. Any talk of a technological singularity, in the terms in which it is generally understood, is ludicrous at best. In addition, the promoters of the idea are talked of using language ruder than I've ever heard used before.
I'd like to ask any member of this community to comment or reply in the knowledge that it serves a serious purpose. I should be clear that the ideas expressed in this post are not ones I hold personally, simply because I don't have the knowledge or experience required to do so. Please be as direct as you wish!
abufrejoval - Saturday, April 30, 2016 - link
First of all, Moore's law is a little more complex than "compute performance will double every x months". There are some important economic factors underneath, which are currently changing, and one of those is *direct* competition, which is no longer really happening. Instead we see an evolution into niches, each niche guarded by its own gorilla.
One of the cornerstones of Moore-type progress was that the square increase in usable transistors due to a linear process density increase could be translated directly into computing power.
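To make that "square increase" concrete, here is a minimal back-of-the-envelope sketch in Python (the classic ~0.7x linear shrink per node is a textbook approximation used as an assumption here, not an exact foundry figure):

linear_shrink = 0.7                       # assumed ~0.7x minimum feature size per full node
area_per_transistor = linear_shrink ** 2  # area scales with the square of the linear dimension (~0.49x)
density_gain = 1 / area_per_transistor    # ~2.04x more transistors in the same silicon area
print(f"Density gain per full node: ~{density_gain:.2f}x")

So a merely linear shrink historically delivered a roughly quadratic payoff in transistor budget, which is the economic engine described above.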
That computing power gain curve has hit serious trouble with things like the Gigahertz wall and the resulting core number explosion hasn't resulted in linear gains for many consumer workloads.
It's probably more appropriate to say that Moore's law is running out of steam, especially as the cost of shrinking is rising more sharply than the compute power gains (e.g. EUV vs. multi-patterning/multi-masking).
But what's coming to an end before Moore's law is general purpose compute, one architecture for all jobs, which is what x86 stands for most.
We've seen this a bit with GPUs taking a huge chunk out of general purpose CPUs both in games and HPC and I believe we are going to see it also at the other side with memory intensive workloads, where transporting data to the CPU for processing is starting to use more energy than the processing itself.
The logical escape for those compute scenarios is to move some of the compute towards the RAM itself and take advantage of the vast parallelism available within those row buffers.
But you won't see Intel pushing that, just as you didn't see Intel pushing non-x86 GPUs until it was about hurting AMD and Nvidia (today more than 50% of a desktop "CPU's" silicon real estate is actually "GPU contra revenue", targeted only at keeping the competition starved of revenue).
Intel wants an effective monopoly to maintain their margins and they are fighting hard and dirty to achieve that. Their natural escape valve, the one they have used for decades, faster general purpose CPUs, is getting both more difficult to build and less and less successful in creating value, because special purpose architectures made possible by web scale companies like Amazon, Google and Facebook (and the ARM mobile space), byte (intentionally misspelled ;-) off large chunks of their general purpose CPU slice.
So do not mistake Intel's trouble as a sign that compute will stop evolving.
It will just evolve in places and in ways which are less Intel, and there is some inertia as the entire IT industry will have to turn to new architectures.
The x86 may well become another mainframe, but won't be the one and only architecture Intel wanted it to be, including graphics (remember the Larrabee?) or including the IoT space (remember the Intel Quark?)
KAlmquist - Saturday, April 30, 2016 - link
Good point. Most of the increase in the computing power of Intel desktop chips over the past few years has been in the GPU. So Moore's law is still operating, but not in a way that helps people who are going to buy a discrete GPU in any case.
TheOtherBubka - Saturday, April 30, 2016 - link
The materials and technology path forward in computing power to keep the same pace as what "Moore's Law" has yielded over the last 40+ years is a bit difficult even over the next 10 years, let alone almost another half a century. This is simply due to the laws of physics that Moore himself knew all those years ago. Hence, 'the natural conclusion' you refer to.
So what are the challenges? Well, there are two major ones. The first is the light source to enable the lithography. EUV is coming along, but the development has not been difficult. Although the research world has options such as e-beam writers and synchrotrons, these techniques are not 'fast' relative to current UV light sources and production. They are also expensive. Hence the reason that companies are pushing current UV light sources to the limit by doing things such as double, triple, or quadruple exposure/patterning. But each patterning pass adds what I am estimating as 10 steps (adhesion promoter HMDS, bake, spin photoresist, bake, expose to light, bake, developer, rinse, dry, quality control check), which adds significant time, cost, and increased difficulty for the final product due to alignment issues. Plus, the extra mask costs and development time. Multiply this out for a few of the bottom layers in the device and it's no easy task to make things work. Hence the R&D costs and years of hard work by all of the materials scientists, engineers, and staff behind the scenes before it even goes to the production line people and their worry about yields, defects, ...
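A quick sketch of that multiplication, using the ~10-steps-per-pass estimate above (the layer and exposure counts below are illustrative assumptions, not fab data):

STEPS_PER_PASS = 10  # HMDS, bake, resist spin, bake, expose, bake, develop, rinse, dry, QC

def extra_litho_steps(critical_layers, exposures_per_layer):
    # Additional litho steps beyond single exposure for the layers that need multi-patterning.
    extra_passes = critical_layers * (exposures_per_layer - 1)
    return extra_passes * STEPS_PER_PASS

# e.g. four critical bottom layers moving from single to quadruple patterning:
print(extra_litho_steps(4, 4))  # -> 120 extra steps, before mask costs and alignment risk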
The other major challenge is on the materials side of making a transistor. My take is that FinFETs were developed as a means of lowering the leakage rate of electrons across the device (hence reducing the 'off'-state power) by making the effective gate length equal to that of a larger process node. But silicon itself has some fundamental limits, and hence the development of III-V materials on large silicon wafers; even 40 years ago, people knew GaAs was faster (higher electron mobility) than silicon. Further, there is recent interest in 2-D materials (WS2, WSe2, MoSe2, graphene, ...) as a means of conducting an electron from one place to another quickly and at lower power. But all of these options take time to develop. Also keep in mind the solution must be able to be scaled to manufacturing speeds and be reproducible beyond normal comprehension to make billion-transistor devices at 100 million units per year. One thing the semiconductor industry has done correctly is to make standards (e.g., 300 mm wafers as the defined unit) so multiple companies have a direction to make a machine that works interchangeably with others in the overall manufacturing production line. Compare that to the numerous starters, battery sizes, tires, rims, etc. of the automotive world.
So is "Moore's Law" dead? Well, yes if "Moore's Law" is doubling transistors every node jump every 24 months. Node jumps are getting fewer and fewer. Look at any roadmap. But will 'performance' increase with time? Sure. There is a need in terms of computing power per unit energy input. It just won't be as fast and as easy.
As for the rest, sounds like you work at an interesting place!
TheOtherBubka - Saturday, April 30, 2016 - link
Edit: should have been 'EUV is coming along, but the development has not been easy'.
zodiacfml - Saturday, April 30, 2016 - link
End of Moore's law for Intel, but not for computing as a whole. Actually, it is Intel's or consumers' decision whether to drive Moore's law. As long as there is no urgent need for compute power, Moore's law will be violated. Yet, if there's an application that will utilize it, Moore's law might be exceeded.
If I'm right, Intel is preparing for a big change and a focus on a new industry they are going into, and they haven't left a clue yet. I think this is the case due to the massive changes that are not related to revenue.
Shadowmaster625 - Sunday, May 1, 2016 - link
Moore's Law ended years ago. It went from 18 months to 19 months. Then it went from 19 months to 20 months. No one can really say exactly when this happened because it has been happening so gradually.
way2funni - Saturday, April 30, 2016 - link
Any thought to the notion that Intel gambled on Microsoft's Windows mobile OS getting better traction by now? It seems to me there has been, and remains, a concerted effort to make your phone a very miniaturized PC running a real Windows kernel. The newest high-end Lumia 950 is a peek at the first layer of that onion being peeled, with its clumsy wired display setup.
I always kinda saw that merger happening at a certain convergence in technologies - say, about when a 7nm core does what a Skylake i5 does now, along with a 'Thunderbolt III Air'-level wireless bus, where you just walk in, drop your phone on your wireless charge pad, and instantly your big screen lights up along with your Kinect and Cortana and whatever their VR setup is - but you don't have to wear a whole helmet, just an earpiece with a little bar, like a mic but at eye level, that projects across your eyes.
By about 2020-2022 Windows has AI, and now you're in Tony Stark's lab with JARVIS.
plopke - Saturday, April 30, 2016 - link
I just wanna know: is this the end of sub-$300 Intel tablets and the end of things like the Asus Transformers? Because, urgh, that would suck so much.
haukionkannel - Saturday, April 30, 2016 - link
Depends on what AMD can do with Zen...
Valantar - Saturday, April 30, 2016 - link
All in all, I think Intel is (sort of) making a mistake here. A little context first, though.
I believe what we're seeing is the simple, unavoidable fact that continuous growth (economic or otherwise) is a stupid myth perpetuated by capitalist ideologues lacking contact with reality. We've long since reached a point where the most popular computing devices are "good enough" for >90% of use cases. The cut-off for this has been gradually postponed as new and more practical/accessible form factors have been made possible (the transition from desktop PC -> laptop -> smartphone), but we're running out of "growth opportunities" there as well. The next logical step is smaller devices that can fulfill all the roles of the larger ones (as with Continuum and similar concepts), so that people have one device instead of two or three, but this will again lead to lower hardware sales, not higher. The same goes for performance - as people are more or less happy with smartphone performance now, they don't see the need to upgrade annually any longer. Crossover/all-in-one devices have little potential to mitigate this. To wit: the largest cell operator here in Norway has reported that two years ago, the average smartphone replacement rate was every 14 months. Now, it's every 27 months. In other words, those who bought new phones annually two years ago have since stopped upgrading.
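Rough arithmetic on what that cycle stretch means for annual unit volumes (a sketch assuming a fixed, hypothetical installed base):

installed_base = 1_000_000         # hypothetical number of active subscribers
for cycle_months in (14, 27):      # average replacement cycle, per the figures above
    per_year = installed_base * 12 / cycle_months
    print(f"{cycle_months}-month cycle -> ~{per_year:,.0f} new phones per year")
# 14 months -> ~857,143 ; 27 months -> ~444,444 (volume roughly halves)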
This is unavoidable - and it's a good thing. Electronics waste and overproduction in unregulated countries is wreaking havoc on our planet, doing far more damage than we know. Reduced production, consumption and replacement is good for everyone except bankers and investors in tech companies. On the other hand, this inevitable decline will lead to less profit in the tech sector, and thus less money to spend on R&D - which will slow things down even more. If the companies are interested in surviving at all, they need to shrink investor payouts and allocate a larger portion of profits to future R&D, to avoid a vicious spiral. Investors will be pissed, but that's their modus operandi - they'd rather have huge payouts one year and then see the company fail the next year than have a steady, lower income for years on end.
Tech is inevitably trending towards a commodity economy, with small margins - sure, it's a necessity, but a necessity that to a huge degree is fulfilled by low-cost devices. Thus, cancelling low-cost platforms is a dumb move. However, I don't believe Atom is the right fit for this going forward. As it stands today, the mobile Atom platforms are more akin to the A53s than the A72s or Twisters of the world. It doesn't scale well enough to be future proof. And Core m is too power hungry, and throttles to stupidly low frequencies under sustained loads. Intel needs an in-between architecture, tailored for the 2-10W range, with a smaller die size than Core m. Apple has absolutely nailed this, demonstrated beautifully by them being the only SOC manufacturer able to produce something that didn't overheat on 20nm yet still performed on par with or above competitors. With A9(X), they showed how scalable this architecture was on a properly working process node. I'd love to see an A9X in a Surface Pro-like package (with the passive cooling solution used for the Core m3 version) and a more lenient power limit - that would be an interesting tablet.
Oh, and Intel needs to implement some sort of per-core sleep function (like Android ARM SOCs have), otherwise they'll never be able to scale in the way future devices will require.
KPOM - Saturday, April 30, 2016 - link
Is there any reason why Intel just doesn't get an ARM license and leverage its superior fab process that way? They could potentially even win some fab business from Apple.saratoga4 - Saturday, April 30, 2016 - link
Intel used to have an ARM license; they sold it because they couldn't make any money selling super-low-cost ARM parts. That would be even more true today, when they'd have to compete with Samsung, Qualcomm, and MediaTek. Intel is in the business to make money, and they've learned the hard way that selling $200 desktop chips makes a lot more money than $10 parts for Asus Zenfones.
peterfares - Saturday, April 30, 2016 - link
They still have their ARM license; they just sold their XScale ARM division immediately before the smartphone boom. That wasn't very smart.
mdriftmeyer - Saturday, April 30, 2016 - link
Apple has Samsung/GloFo and TSMC to draw upon. They don't need Intel fabs.
Reflex - Sunday, May 1, 2016 - link
Intel's fabs are not general purpose like TSMC or GF. They are highly customized and produced in lockstep with new CPU designs. They can't simply fab something else at will. This has given them a tremendous edge over time; they consistently stay 1-2 process nodes ahead of the competition. The downside is that they do not have the flexibility to fab just anything they want.
shelbystripes - Saturday, April 30, 2016 - link
I don't get it. I feel like EVERYONE is missing the point here, and panicking over the "death of Atom" when no such thing is happening.
Atom has evolved to include two very different lines: the "lite" model and the "full" model. The "lite" model was focused on smartphones, and included PowerVR or Mali graphics. This is a very specific flavor of Atom, and it focused on extreme power savings--and to get there, it used third-party GPUs. Intel killed its plans for Goldmont-based SoFIA chips, which killed Goldmont-based smartphone parts, but also killed off the future of Atoms with third-party graphics.
However, Atom is NOT dead, and the "full" model is clearly alive. Intel has already clarified that it will continue shipping Atom x5 and x7 SOCs based on Cherry Trail. They also just announced Apollo Lake, the Goldmont-based Braswell replacement geared toward "netbooks" and "cloudbooks", as well as "2-in-1s" (which are basically premium tablets). Apollo Lake is designed to be modular and compact; Intel is heavily pitching how thin and light it is using soldered-down LPDDR RAM, eMMC, and Intel 802.11ac Wi-Fi. And again, no external GPU license, it's all Intel in-house.
This looks like a strategy we've seen a lot from Intel lately: They struggle with a new architecture, so they only roll it out in one place first, and keep making the prior-gen chips for everything else. Maybe this means Goldmont has a problem, and that problem has to do with achieving power savings over Airmont. So what does Intel do? Intel rolls out Goldmont first in higher-power devices, while it continues to make Airmont-based Atom x5 and x7 CPUs for now. A future refresh could bring new "Apollo Lake-T" replacements for tablets down the road.
So where did this leave Broxton? Broxton was a major, aggressive redesign. It was supposed to achieve power savings from Goldmont, and the utility of a modular "chassis" design that allowed easy integration of new features, including third-party IP. But what external IP would Intel want to integrate at this point? Intel has mature in-house 4G/LTE modems now, killing SoFIA puts an end to Intel's external graphics licensing (and Broxton was supposed to use Intel HD graphics regardless), so this looks like Intel abandoning third-party IP integration in its CPU/SoC development. The future for Intel will be aggressively developing its own graphics (which the Core team is already doing) and modems (see the aggressive 5G/connectivity push). Intel will want even tighter integration of these in-house components than a generalized modular design can deliver, so I don't see the point of Broxton anymore. Broxton isn't low-power enough to replace SoFIA, today it would be redundant to an Apollo Lake-T part, and tomorrow doesn't look like it's bringing third-party IP anymore.
Intel didn't say it's just giving up the mobile market. Their full statement says the opposite; it says that they'll try "to drive more profitable mobile and PC businesses." All of this refocuses Intel on using its in-house GPUs and moving away from third-party licenses. So what is Intel really abandoning here? They're not abandoning the mobile market. They're abandoning PowerVR GPUs and third-party IP integration. They're also abandoning out-licensing to Rockchip and Spreadtrum, which means keeping their in-house IP in-house.
So does Intel intend to just throw away SoFIA? There's a lingering question from the big 5G/connectivity push that Intel announced in February of this year. Alongside their 5G tease and some new 4G/LTE modems, they rolled out a brand-new Atom x3 part. This wasn't a Goldmont-based SoFIA part, and it's not identified anywhere as a "SoFIA" Part. The Atom x3-M7272 is a quad-core 64-bit Atom SoC with an integrated LTE/3G/2G modem for use in IoT, automotive, and embedded devices. It uses LPDDR2/LPDDR3 and the feature-set otherwise overlaps with the x3-C3445, so it was obviously intended to be based on the Airmont SoFIA LTE, the only SoFIA chips to be fabbed in-house at Intel. But the M7272 has no GPU; this isn't for infotainment, it's promoted for automotive connectivity. It was announced with a "long life availability & support program", so Intel clearly planned to keep making it for a very long time. It seems really odd that Intel would kill SoFIA to focus on IoT and connectivity solutions, but in the process, kill a key IoT solution it announced just two months ago. However, the M7272 isn't identified anywhere on Intel's website as a "SoFIA LTE" part, even though the C3405 and C3445 are.
With the SoFIA 3G parts cancelled, it doesn't make much sense to launch just the C3405 and C3445 as an incomplete x3 product line. But the only thing truly unique about the x3, that isn't already supported by other Intel teams, was the third-party graphics. The rest of the SoC, including the CPU and integrated modem, are all current-gen Intel IP. It doesn't take much to just disable the GPU and crank out IoT Atoms as a fully supported, short-term solution.
Given all that, here are my predictions for Intel's near future:
1) Intel goes ahead and rolls out the M7272 "v1.0" using a SoFIA LTE chip with the GPU disabled. They're 98% there already, and they need it as their current-gen IoT solution.
2) Reallocated personnel are used to fast-track a previously unannounced Atom core with an integrated LTE modem, IoT utility, and no third-party IP. This would be a die shrink of SoFIA LTE, except that it would replace the Mali GPU with a bare minimum number of Intel GPU EUs for applications that require video. Since this is all in-house IP, Intel already has experience building Airmont and Intel HD Graphics on smaller processes, and it would be fabbed internally, this should take far fewer resources than SoFIA is now.
3) Intel announces "Apollo Lake-T" for tablets and its new in-house SoFIA replacement for IoT, embedded, and maybe even high-end smartphone devices.
4) Once the SoFIA replacement is up and running, Intel quietly switches over to an M7272 "v2.0" that is a drop-in replacement for the M7272. Like v1.0, it's the low-end Atom chip with the GPU disabled, but this lets Intel stop making the v1.0 part. Intel can also launch new IoT/embedded parts based on this product, building it out into a larger product line as part of their IoT and connectivity platform.
Burner73731125975 - Saturday, April 30, 2016 - link
What constitutes the IoT? I hear a lot of buzz about it, but I've yet to see anything outside of hobbyist boards, Nest thermostats, and the like. Smart appliances still seem very niche and limited as far as I can tell. What am I missing? What are the big IoT applications right now?
Valantar - Sunday, May 1, 2016 - link
IoT is a way for manufacturers of various gizmos to add (semi-)useless "functionality" that people usually don't want, that makes them far harder to use, and that inevitably leads to obsolescence and far earlier replacements, pissing off users even more.
What's the point of a "smart" LED lightbulb with an estimated life of 20 years, if its platform stops being updated after 2-3 years and you're no longer able to control it? This same question applies to _every single_ IoT thing imaginable; just replace "LED lightbulb" with kettle/stove/fridge/water heater/radiator/air conditioner/whatever. IoT has no real open standards, and thus lacks any semblance of future proofing.
Not to mention the bajillion security issues with items like these.
The buzz about IoT is because it sounds futuristic (it really does!), and it has some potential - given time, open standards, security and future proofing. As of now, it's a market flooded with proprietary, gimmicky junk.
name99 - Saturday, April 30, 2016 - link
What do you imagine IoT is?
A smartwatch is the HIGH end of IoT. IoT requires REALLY low power CPUs. And you imagine that tablet-specced Intel chips are going to play a role? Get real.
Call me when Intel gets a design win for a smartwatch, or a smartscale, or a fitness tracker.
Burner73731125975 - Saturday, April 30, 2016 - link
No, Intel is already claiming to be doing really well financially in IoT and that it will be a major cornerstone of their business in the long term. I'm saying I don't know where Intel is doing anything noteworthy here. I realize a Core i series or Atom is way above the power available in watches and most microcontroller applications. But if Intel is making out really well in IoT, what specific products or applications are driving this?
name99 - Saturday, April 30, 2016 - link
"But if Intel is making out really well in IoT, what specific products or applications are driving this?"I think lies are driving it. More specifically, I think Intel is trying to deliberately mislead investors by booking any revenue it can as IoT as long as their is some vague connection. Xeon's sold for IoT cloud? Counts as IoT. Modems sold to wireless security systems? Counts as IoT. etc etc
This may be legal, but it is not informative from the point of view of understanding future technology. That was my point of the design wins I said I wanted to hear.
(And let's remember that Intel is happy to flat-out lie as long as the lie is vague enough that there's no real there there. How long did we hear that Skylake was on track before it was clear that it was actually delayed by about a year? They never actually admitted that; they simply claimed that yields were great and it was just that they chose to deliver this tiny dribble of chips for the first six months. Same thing with 10nm; everything was on track until one day it wasn't and we all heard about Kaby Lake...)
shelbystripes - Monday, May 2, 2016 - link
OK, so you're just a raging anti-Intel fanboy of some sort. Lies? Deliberately misleading investors? Those are serious accusations. Road maps are presented as projections only, and for a reason: no company in its right mind would ever guarantee future performance. Technology setbacks happen, and as a result timetables get pushed back or volumes are constrained for a while. This isn't new or unique to Intel. AMD/ATI and Nvidia have this problem too. Memory manufacturers announce new SSDs and then are months late supplying the market.
And with Intel, it's particularly understandable just HOW their assumptions worked. Their "tick-tock" release schedule worked for multiple generations of product, with parallel 2-year development cycles (one focused on core design improvements, one focused on process-shrinking the most recent core design). It was obvious to anyone they were assuming that they could achieve a process shrink every two years. Anyone familiar with the technology knows that there are new and greater difficulties with each process shrink due to the laws of physics. I'm not sure how it's a lie or deceptive to say "our goal is to achieve a process shrink every two years" when the risks in that statement are all public knowledge.
Oh, and it was known that 10nm would be delayed long before Kaby Lake was announced. Way back in February 2015, a year before announcing Kaby Lake, Intel had already pushed its 10nm product launch back to late 2016/early 2017. The Kaby Lake announcement pushed back the 10nm chips a bit further, to late 2017, and introduced a gap filler product to fill the extended product gap. There was no single, sudden "oh we've been way off all this time and didn't tell you" moment like you're claiming.
Portraying this as some conspiracy to defraud investors is just crazy.
shelbystripes - Monday, May 2, 2016 - link
Smartwatches and fitness trackers are compact wearables. These are a very specific category of IoT devices, where size and power are at an absolute premium. And Intel already has a solid product on the market with an embedded Atom CPU. Basis Peak is a connected watch/fitness tracker that has gotten pretty good reviews. It's not perfect, but neither is any other fitness tracker out there. The point is, it's obviously something that works today.
But IoT isn't just about smartwatches and fitness trackers at all. Like I already mentioned, Intel announced an embedded solution for automotive applications. Connected vehicles are the next big thing, and automotive designers can afford to use a 2W tablet-class SoC instead of a 0.5W smartphone SoC. In 2015 there were 17M smartwatches sold, but more than half were Apple Watches with in-house ARM CPUs, and most of the rest were lower-cost, lower-margin parts. There's not much room for Intel to grow into there at the moment. But there were 16.5M vehicles sold in 2015, and that's a wide-open market. Integrated connectivity systems are currently premium upgrades or luxury standard features, making an Atom SoC with integrated LTE an ideal platform with potential for profitability. If Intel rolls out a version with integrated Intel HD graphics, then it has an all-in-one SoC solution that can fully power the built-in infotainment system as well.
This is the market Intel is looking at now. Automotive, industrial, medical, and commercial applications where the SoC is a relatively small part of the system price, making it easier to absorb the premium price Intel wants for a high-performance IoT product. Intel is going to focus on those things for now--but when those things can also downscale to tablets and maybe even phablets, you'll see Intel staying in that market too, if only at the high end.
lorribot - Saturday, April 30, 2016 - link
I desperately want a Windows phone with Continuum that can run all my old crappy Windows apps (like RSAT); that means a full copy of Windows, which needs x86 instructions. Intel could do this and blow the business phone market wide open, or they could not bother, go away and sulk because no one wants to play with their ball any more.
Burner73731125975 - Saturday, April 30, 2016 - link
Most business users don't even know what x86 is. The major desktop programs have all been brought to phones and tablets in at least some form. Tech nerds like us like the idea, but we aren't about to drive the market.
ET - Sunday, May 1, 2016 - link
I'm sorry to read this. A phone that can double as a PC is the only somewhat exciting future development I can see in the PC space. Now the only phones that could be converted to "PCs" would be ARM based, and that's not nearly as exciting.
Murloc - Sunday, May 1, 2016 - link
Remix OS is an interesting development in this regard. It's a full-desktop Android fork.
So what you're saying can still happen, but on ARM + Android.
yhselp - Sunday, May 1, 2016 - link
I really hope Intel doesn't stop developing Atom for tablets and low-cost computers. I've been excited for a possible Surface 4 for a while now. Offering a base model Core M for a reasonable price is definitely an option as well, but seems close to impossible. Developing a complete solution for smartphones, shipping on time, and having design wins is one thing, and while it's sad to see Intel exit the market, it's understandable; but why can't they keep developing an efficient low-cost solution for PCs? Is it so impossible for Intel to have a low-margin side of their business? - one that doesn't compete directly with their core business, no less. No one in the market for a budget PC tablet would even vaguely consider buying a premium Core M product instead. Not having an efficient low-cost x86 SoC would put a stop to a lot of innovation that's been happening recently. I hope Intel can continue developing Atom.
yhselp - Sunday, May 1, 2016 - link
P.S. I guess those $150 Windows tablets were too good to be true for long - they couldn't have been profitable for either Intel or Microsoft.
Argosy - Monday, May 2, 2016 - link
2007 may have been a good time to spin off (was it Marvell that Intel sold to?), but keep a controlling share of an ARM-based mobile vendor. Let it sink or swim on its own merits. Intel culture would never allow a much lower-margin architecture/product to undermine the x86 business.
The problem now is not whether ARM ends up in conventional PCs (or server applications), but whether phones/tablets satisfy most people's computing needs, causing a decline in PC-class device sales?!
People keep debating when Apple will release an ARM-based OS X device; I don't think that is the plan. Near term, the iPad Pro is Apple's "ARM-based OS X device" (yes, I realize it is iOS and not the answer for a lot of people, but it is arguably a "laptop" answer for a growing number of people). Surface is more of an answer now for many people, but it is less competitive looking forward to the generation(s) growing up in a touch/app-based world, due to a lack of designed-for-touch apps and no strong phone base. Having corresponding apps on your tablet and phone is more compelling than corresponding PC apps on your "tablet" without the phone component.
My 2 ¢
shelbystripes - Thursday, May 5, 2016 - link
I'm glad the author belatedly realized that this wasn't a cancellation of Atom, and fixed the headline and article.
Klug4Pres - Wednesday, May 11, 2016 - link
I think the headline should be "Intel cancels Broxton, its smartphone and tablet SoC". What remains of Atom is targeted at "Cloudbooks", i.e. cost-optimised netbook-like devices at or above 11.6 inches in screen size.
shelbystripes - Tuesday, May 24, 2016 - link
No, that's not accurate either. They did not say they're abandoning the mobile space. You're making the same mistake everyone else is, which is assuming that Intel has no alternative plans.dwade123 - Thursday, May 19, 2016 - link
Let's be realistic. ARM is mainly associated with toy devices. Intel and Windows offering the same mobile stuff is pointless as 2 is already a crowd with iOS and Android. Let ARM have the tiny margin market.