The Fall of Intel

Is it a fall, or just wanting more profits?
The latest Intel CPUs have been slagging themselves. In many cases, Intel will provide replacements (a real expense), but having a server-brick is inconvenient at best and a major loss in revenue at worst. This is very bad for Intel's reputation, especially with AMD right on their heels and MS working on an ARM transition.

I am sad/not-sad. To me, their biggest contribution to the industry has been to set a standard that almost anyone with some imagination can do better than. But there is so much out there that is better these days that Intel's usefulness has expired.

Yes, most of the Top500 supercomputers are running some form of x86 processor. But the heavy work on those systems is now almost entirely done on GPU-type accelerator cards. There is nothing great or magic about Intel's designs – I find them kludgy and outdated.
 

They've had a good run, going back to the 1970s. I still remember some 8080 opcodes :). Times change.
 
The sad part is that they could turn things around. But, as I said when they hired Gelsinger, he’s not the guy to do it.
 
Oh ... wow ... to be honest I'd forgotten they were that big! Sad about all those people losing their jobs though. :(
 
You think so? They have a corporate culture that is noted for its inertia and intransigence. I am skeptical that they can find a rudder big enough to turn that hard.
Well, that’s the issue, isn’t it? The right CEO, one who can see the writing on the wall and is willing to cannibalize current product lines, could do it. Short-term pain for long-term improvement. Instead we get Gelsinger, who thinks the solution is to just declare that everything will be fine and who keeps making promises about timelines, without considering that even with faster timelines, the products they are going to make may not be worth making.
 
May I ask: what would you do if you were at the helm?
 
This week: [attached image: IMG_5546.jpeg]

(I do not put a lot of stock into stock prices, but …)
 
I hear they're going to call their next chip Dry Lake.

Seriously, sorry to see so many people lose their jobs, but it seems the writing's been on the wall for some time.
 
May I ask: what would you do if you were at the helm?

At this point? Well, things are much more broken than they were when Gelsinger took over, so I guess at this point I’d have to take even more drastic measures. Probably I’d (1) fully embrace the future, which will be RISC chips with integrated graphics solutions, and set my design teams on that: Arm with integrated graphics, aiming to destroy QC at that game, and offering customers the ability to get custom designs – let Dell pick how many CPU cores, how many graphics cores, pick from a menu of IP blocks, etc. (2) Spin off the fabs. (3) Put minimal effort into x86-64-compatible designs. x86-64 isn’t long for this world, and “lots of people still buy them” were the last words of plenty of tech companies. Keep that business on life support for legacy uses, for now, but you don’t need a new architecture with 30 SKUs every year. (4) Announce a new ISA that looks a lot like x86-64 but without all the cruft: no more variable-length instructions, no more 8/16/32-bit support. Put my best minds on the first implementation and beat the best Arm cores in performance-per-watt (which would be possible once all that junk is gone). Work with MS to port Windows to it and on a translation layer to allow old software to keep running decently. But announce that x86-64 is dead at Intel. (Hell, they didn’t even invent it, so pride isn’t a good reason to keep it.)
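
To put a finer point on (4), here's a toy Python sketch – purely illustrative, not any real ISA or decoder, and the length scheme is invented – of why dropping variable-length instructions matters so much for the front end. With a fixed instruction width every boundary is known up front, so a wide decoder can slice out many instructions in parallel; with x86-style variable lengths, each instruction's start depends on decoding the one before it.

Code:
# Toy sketch: fixed-length vs variable-length instruction boundary finding.
def fixed_length_boundaries(code: bytes, width: int = 4):
    # Every instruction start is known up front – trivially parallelizable.
    return list(range(0, len(code), width))

def variable_length_boundaries(code: bytes, length_of):
    # Must walk the stream serially: each start depends on the previous length.
    boundaries, pc = [], 0
    while pc < len(code):
        boundaries.append(pc)
        pc += length_of(code, pc)  # unknown until the prior bytes are decoded
    return boundaries

def toy_length(code, pc):
    # Pretend decoder: first byte encodes a length of 1..15 bytes, loosely
    # mimicking how real x86 instructions range from 1 to 15 bytes.
    return (code[pc] % 15) + 1

blob = bytes(range(64))
print(fixed_length_boundaries(blob)[:8])                 # [0, 4, 8, 12, 16, 20, 24, 28]
print(variable_length_boundaries(blob, toy_length)[:8])  # serial walk: [0, 1, 3, 7, 15, 16, 18, 22]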
 
I've definitely seen similar suggestions proffered before.

To quote the article, the losses are almost all in Intel's fabs. Right now the rest of the company appears to be subsidizing the rate of node progression and capacity buildout the fabs require to meet their IDM 2.0 goals, which of course are going to take a long time to hit. I don't think there are any shipping products from outside parties on Intel fabs. Intel has announced a few high-profile foundry customers, but not what they'll be making on Intel fabs, which fabs they will use, or when to expect products. Sinking so much capital into fabs is a very risky bet that, if it doesn't pay off, could sink the company and force them to spin off the fabs regardless. Of course it may yet work, these things take time, and if it does Gelsinger will be hailed as a genius, but ... if it doesn't ...

It's amusing that Forbes was touting earlier this year how much of a success Gelsinger's plan already is, with a pair of articles on how it was time to take Intel seriously as a foundry. Again, over time, maybe. But right now, they're having to burn through money like there's no tomorrow ... or like they are about to receive $8.5 billion in government subsidies.
 
Work with MS to port Windows to it and on a translation layer to allow old software to keep running decently. But announce that x86-64 is dead at Intel. (Hell, they didn’t even invent it, so pride isn’t a good reason to keep it.)
This seems like such an obvious move. If a completely different ISA (AArch64) can translate x86-64 at a reasonable speed, then I’d imagine a new Intel ISA designed with x86-64 translation in mind might actually hit faster speeds than existing CPUs.
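
To make that concrete, here's a minimal sketch of the idea in Python (the target mnemonics and helper names are invented for the example; this is nothing like a real translator such as Rosetta 2). The easy instructions become 1:1 rewrites; the hard parts the toy ignores – condition flags, the memory-ordering model, self-modifying code – are exactly what an ISA designed with x86-64 translation in mind could make cheap by matching x86 semantics in hardware, much as Apple's chips reportedly provide an x86-style total-store-ordering mode to help Rosetta 2.

Code:
# Hypothetical 1:1 opcode map for the easy cases (all target names invented).
SIMPLE_MAP = {
    "mov": "move",
    "add": "add",
    "sub": "sub",
    "imul": "mul",
}

def translate(block):
    # Translate a list of (opcode, operands) pairs into the made-up target ISA.
    # Anything without a 1:1 mapping falls back to a runtime helper call –
    # the toy equivalent of a slow path.
    out = []
    for op, operands in block:
        if op in SIMPLE_MAP:
            out.append((SIMPLE_MAP[op], operands))
        else:
            out.append(("call", ["rt_emulate_" + op] + operands))
    return out

x86ish = [("mov", ["rax", "[rbx+8]"]),
          ("add", ["rax", "rcx"]),
          ("cpuid", [])]
for insn in translate(x86ish):
    print(insn)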
 
The issue is one of HMS Dreadnought. The British navy was the 19th century's preeminent naval force; even the arrival of the ironclads didn't really change that. But the Brits' own invention of HMS Dreadnought did. The design was so radical and powerful that it made every other ship in the world obsolete. Great! Including their own. Not so great. Suddenly everyone, including the Brits, was effectively starting from scratch, and every other European power could now build its own navy and achieve parity with the Royal Navy. The Brits' slight head start with the Dreadnought meant little compared to their previous built-in advantage of 100+ years of wooden-ship supremacy.

Of course in this case ARM would be HMS Dreadnought, and Intel is very much hoping that x86 isn't the wooden ship of yesteryear. Regardless, Intel are desperate not to abandon one of the biggest advantages they have: a legacy of domination with x86 software and no compatibility layer needed. If they level that playing field even a little, such that everyone, including Intel, needs an x86 compatibility layer anyway, then it might as well come down to native performance as customers' programs move to the new ISAs. Relatedly, the discussion over in the Qualcomm thread delved into how inflexible Microsoft Windows is in trying to support both x86 and ARM, and you can see why Intel are so focused on making x86 better (i.e. more ARM-like) rather than abandoning it completely. I'm not saying it's the right thing to do, and their own plan is really quite risky as well, but I can understand it. And who knows? With my crystal ball still in the shop, maybe it'll work, but boy do they have some hurdles.
 

Nobody at Intel is under the misapprehension that x86-64 can win, long term. The only inherent advantage it might have is instruction density. It will always lose in perf/$$, perf/watt, etc. If it wins on abs(perf) that’s only because the Arm people don’t have financial incentive to make those chips. Abs(perf) has become less and less important, especially as the magnitude of the abs(perf) advantage has shrunk.

Add in the fact that the CPU has decreased in relative performance compared to GPUs, neural engines, etc., and that trend is continuing.

So all they have left is “we run all that old software from 1992!” But even there they have competition from AMD. And compatibility is becoming less and less important: any modern chip running an emulation layer can likely run that 1992 software a thousand times faster than the native hardware of 1992 did.
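
(Back-of-envelope on that point, with deliberately round numbers that are assumptions rather than measurements: a 1992-class 386/486 desktop managed on the order of 10 MIPS, a single modern core retires instructions in the tens of billions per second, and even charging a generous ~3x for binary translation you land around three orders of magnitude.)

Code:
mips_1992 = 10e6            # a 1992-class 386/486 desktop: order of 10 MIPS (assumption)
ips_modern = 25e9           # one modern core: ~5 GHz at an IPC of ~5 (assumption)
translation_overhead = 3.0  # assume binary translation costs roughly 3x

speedup = (ips_modern / translation_overhead) / mips_1992
print(f"~{speedup:,.0f}x")  # ~833x – order of 1,000x before even counting caches and RAM speed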

The time is rapidly approaching where Intel will have to compete entirely on perf/watt, and if they wait any longer it will be too late.
 
I understand, I’m just laying out the reasons why they don’t want to risk pouring money into developing a new ISA, especially given how Itanium went. If they can’t develop CPUs on the new ISA with perf/W or perf/$ as good as ARM’s, then they will just lose on native performance anyway. So they’re hoping everything you’re writing isn’t true – that what will actually happen is that as the accelerators become increasingly important, the x86 deficiencies will simply not matter much and ARM’s advantages will simply be unimportant, and x86 will thus live on and on and on. Make no mistake, I agree with you, but I can see why they are so hesitant to throw out their biggest advantage, market entrenchment.
 

The problem is they’ve got no good story to tell re: the “accelerators” either. In tech, you can’t survive long term if you keep selling the same product. Yet that’s really their strategy.
 
Aye, the thing that really stands out to me about the Tom Forsyth article @mr_roboto posted was how almost antagonistic Intel was to its accelerator teams ... and continued to be long after that article was written as far as I can tell. That along with the fab failures a decade ago and well ... suddenly it's a lot less surprising that Intel is burning through money now to make up for it and will probably have to continue to do so for quite some time to pull out of the stall - if they ever do.

Intel's other two main business advantages are its immense size (yes, that's bad for steering a ship away from the icebergs, but sheer momentum has carried Intel right through them before) ... and the fact that Apple, the current industry-leading chip designers, aren't primarily in the business of selling processors. Intel would've been in much worse shape over the last 4 years (and the next 4 years too) if Apple weren't a "lifestyle company". Of course, as a counterfactual that ignores that Apple is in the financial position it's in because it's not in the chip business, but still the point remains that if Apple were Qualcomm (never mind if they were AMD), Intel would be in significantly more trouble in the chip market. They might not even have the financials from the rest of the company to propel their fabs. I might be exaggerating a little, but that hole in their budget would at least be a hell of a lot bigger. That's one reason why I was so mystified when Gelsinger first took over that Intel bothered to focus on Apple at all: just count your lucky stars that they aren't your main competitor, that they aren't even in the same business as you (not really), and don't keep drawing even more end-user attention to their products.
 

Another problem Intel has is that I suspect more of their biggest customers may soon be their competitors. You don’t have to be as good as Apple’s chip designers in order to be better than Intel’s. Eventually one or two more Intel customers will likely start designing their own chips, too, in order to differentiate their products. Even Arm’s stock designs will be able to more or less match Intel soon enough. So Dell or whoever can put together an SoC with some very nice Arm cores, some accelerators sourced from other design firms, some Dell-specific special sauce, etc. The genie isn’t going back into the bottle. One-size-fits-all chips are going to look pretty lame before too long, whether they are Arm or not. There are just so many market forces working against Intel.
 