Gelsinger being Gelsinger…

SuperMatt

Intel is going to get a nice corporate welfare check so why not?
Yep, put all your extra money into dividends now because your lobbyists got you tens of billions of taxpayer money to invest in the actual chip fabs!
 

Citysnaps

I'm reading the CHIPS Act summary. One portion allocates $50B "to develop domestic manufacturing capability—and research and development (“R&D”) and workforce development programs authorized by the FY21 NDAA (Sec. 9902 & 9906)."

I assume that addresses building more fabs in the US, and I'm wondering what the current cost is for a state-of-the-art fab. My recollection from the early 2000s (when I was at a small chip design company) is that it would have been around $4B. Today, I would guess it's somewhere around 3-4 times that. Is that a reasonable assumption? And if so, would that fund around 3 or 4 fabs?

Also in the summary: "The language would also re-affirm that the purchase of stocks and dividends are not an eligible use of CHIPS funds as determined by the eligible use of funds already required under the FY21 NDAA." Interesting.
 

Cmaier

Citysnaps said:
…I'm wondering what the current cost is for a state-of-the-art fab. …

A fab on the leading node would cost around $20B right now, probably, at least in the U.S.
 

Citysnaps

Cmaier said:
A fab on the leading node would cost around $20B right now, probably, at least in the U.S.

Then that would amount to roughly two fabs. Is that enough to make a significant difference?
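
For reference, here's the back-of-the-envelope math behind "roughly two," as a quick sketch using only the ballpark figures floated in this thread (none of these are official numbers):

```python
# Back-of-the-envelope fab math from this thread's own rough numbers.
grant_pool = 50e9        # the ~$50B CHIPS allocation quoted above
fab_cost_2000s = 4e9     # recollected early-2000s cost of a leading fab
fab_cost_today = 20e9    # the estimate above for a leading-node US fab

print(f"Cost multiple vs. early 2000s: {fab_cost_today / fab_cost_2000s:.0f}x")
print(f"Fabs fully fundable from $50B: {grant_pool / fab_cost_today:.1f}")
```

So the pool covers a roughly 5x cost increase since the early 2000s and about two and a half leading-edge fabs, before any cost sharing.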
 

Cmaier

Citysnaps said:
Then that would amount to roughly two fabs. Is that enough to make a significant difference?

I think the value-add of the bill is to compensate companies for the difference in cost between building in the US vs. building abroad. But even building one or two superfabs in the US would make quite a difference to the overall economy.
 

Colstan

Apparently the rumor going around is that Intel is highly dysfunctional, primarily at the managerial level, which has put a number of projects in jeopardy, the most public being the Arc GPU division. A couple of their friendly marketing guys have been making the rounds with the likes of Linus and Gamers Nexus, putting a happy face on that product despite the delays. The problem seems to be that what they're saying publicly and what they're telling the rest of Intel are two different things, and the GPU guys are doing this marketing push on their own, without sign-off from the rest of the company. It looks like an internal turf war, with those outside Arc claiming that the division is lying to consumers and AIB partners about the readiness of the product.

Evidently, Gelsinger isn't happy about the disarray in the graphics department, and there's going to be a full review. They've got a short time to get their act together, or top management is going to cancel the discrete GPU line and focus on the data center. Larrabee all over again. At least the i740 shipped as an actual mass-market product, failure that it was.

Apparently, part of the first net loss that Intel has suffered in decades was due to a write-down on the 4 million Arc dGPUs sitting in a warehouse, waiting for finished drivers. The problem appears to be more fundamental, in the actual Alchemist silicon itself, and may end up killing any future plans for discrete performance cards, in particular for gamers. Anybody remember the Matrox Parhelia?
 

Colstan

I remember back when it was looking grim for AMD, at least from the outside, and everyone assumed they'd eventually be squashed by team blue. Well, last Friday, AMD surpassed Intel's market cap as they continue to execute well beyond their larger competitor. Everyone said that Pat Gelsinger was going to be Intel's Lisa Su, but thus far, that is not the case. Maybe Intel is hoping all that sweet government bank is going to improve the situation, but from what I've heard, there is fundamental dysfunction in every division of the company. That includes engineering, management, marketing and leadership, with substantial infighting between fiefdoms. I mentioned how the Arc GPU team is a mess, but other, less public projects, like the Sapphire Rapids Xeons, are also significantly delayed, and there's no excuse for that, since it's not a new product and is planned to be fabbed on a mature node.

I certainly don't want Intel to fail. Despite his legitimate grievances over the company, I've never gotten the impression that @Cmaier does either. I think we just want them to, well, not suck at their jobs.
 

Colstan

Thus far, Meteor Lake, Sapphire Rapids, Ponte Vecchio, and Arc have all been delayed. That's Intel's entire product line, except for Raptor Lake, which is slightly behind schedule, will launch after Zen 4, and will have no follow-up. Gelsinger may end up axing GPUs because of the company's losses, and because he doesn't want to be an ex-CEO. In an interview, Jon Peddie mentioned how Gelsinger was in charge of the i740 project, which Peddie consulted on, and that GPUs are "very personal to Gelsinger". So it does look like he's going to try to save Arc, but spending billions more may not make sense if it's eternally delayed.

Part of the problem is that Intel thought they could leverage their iGPU drivers, but that fell flat, because those aren't performance parts and nobody uses them for anything remotely intensive. Peddie also points out that, unlike the teams at AMD and Nvidia, the Intel Arc folks were just tossed together; many don't know each other, and some don't even like each other. Intel is a giant company with many fiefdoms and infighting, with one part of the company completely unaware of the goings-on within other divisions, whether because of bureaucracy or simple turf protectionism.

I get why the U.S. government is funneling money into Intel and other chip makers. However, I have to wonder if we are witnessing the decline of a tech giant. Intel is never going to go out of business, at least not while we need semiconductors and the government props them up, but they are quickly coming to resemble IBM. IBM still exists, but it has no relevance in defining the future of the industry. Intel may be headed the same way: a foundry that's always a node behind, with an occasionally interesting product, but a leader in nothing.

I've heard some commentators say that Gelsinger "just needs time" and that, concerning their GPU efforts, we "need to be fair to Intel" because it's their first mass-market discrete GPU. That doesn't explain every other product falling behind. I don't want Intel to fail; they're doing that on their own. However, at this point, I'm starting to agree with @Cmaier: Intel simply doesn't deserve fairness, given their repeated, obvious failures.

Yet there are still people who somehow justify the idea that Apple should have stayed with Intel, which is a detachment from reality. Losing x86 may suck for them as individual users, but the Mac's gravity-defying performance and revenue growth are undeniable.
 

Cmaier

Colstan said:
Thus far, Meteor Lake, Sapphire Rapids, Ponte Vecchio, and Arc have all been delayed. …

As someone who spent almost a decade competing with Intel, and who followed them before that and since, Intel’s value-added - the place where they had an advantage over the competition - was their fabs. When I was at AMD, Intel had something like 8 major fabs, compared to our 1 or 2 (depending on the time frame). And they had remarkable yields - sometimes double ours - and their processes outperformed ours at the same node. They were able to dedicate entire fabs to just trying to get yield and performance up, and when they figured out the recipe they quickly replicated it to all the rest of their fabs.
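
To put a rough number on why those yields mattered, here is a toy cost-per-good-die sketch; the wafer cost and die count are illustrative assumptions, not real AMD or Intel figures:

```python
# Toy model: cost per good die as a function of wafer yield.
# Wafer cost and die count are illustrative assumptions only.
wafer_cost = 5_000   # dollars per processed wafer (assumed)
gross_dies = 200     # candidate dies per wafer (assumed)

for yield_pct in (40, 80):  # "ours" vs. roughly double, per the post above
    good_dies = gross_dies * yield_pct / 100
    print(f"{yield_pct}% yield -> ${wafer_cost / good_dies:.2f} per good die")
```

Doubling the yield halves the cost of every good die, which compounds across millions of parts.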

Nobody in the industry respected their design skills. They had some good designers in Oregon, but the rest were not good. Better design teams were all over the place - DEC, AMD, Exponential, RISE, IBM, etc. But nobody had those fabs.

I have no idea what happened to them and why they screwed up the fab side so badly, but if I were Gelsinger I’d split the company in two, let someone else take over the “design” company, and focus on getting those fabs working again. There’s no reason they can’t compete successfully with TSMC if they put in the resources and focus on it.

As for their GPUs and CPUs, well, if they are made using the same fab technology that Nvidia and AMD have access to, I don’t see how Intel will ever be able to compete. If you are a great CPU designer, why on earth would you want to work at Intel? They don’t pay that well, you don’t get to work on fun things, you own a small portion of the chip and have to deal with a ton of bureaucracy - why wouldn’t you go work someplace else?

The motivating factor for every great designer I knew was that they wanted to start with a clean sheet of paper and own as much of the design as they could. In my first CPU design job I got to own half the chip by area. At Sun I got to start with a clean sheet of paper and own the entire instruction scheduler. At AMD I got to be co-owner of the entire floor plan, owned entire units like the FP, the integer pipelines, the scheduler, etc., got to own the design methodology, and even got to come up with part of a new instruction set. Every single person I ever interviewed from Intel owned the same unit for their entire time there, and it was always a tiny thing like “wire routing channel number 4.” They only did a small part of the job - just the physical design, or just the logic design, or just the circuit design. You simply aren’t going to get the best designers if you aren’t going to let them be challenged, unless you overpay them tremendously (and even then, they won’t stay long).

Anyway, I digress.

Wake me when Intel is competing with TSMC on fab nodes.
 

Colstan

Cmaier said:
As someone who spent almost a decade competing with Intel, and who followed them before that and since, Intel’s value-added - the place where they had an advantage over the competition - was their fabs.
"Only real men have fabs."

I've been meaning to ask, did you ever meet Jerry Sanders while you were at AMD? He seemed like a real character.

I've read that, until Sanders's retirement, every worker who started a job at AMD in Dresden was given a printout that read, "People first, products and profit will follow!"
 

Cmaier

"Only real men have fabs."

I've been meaning to ask, did you ever meet Jerry Sanders while you were at AMD? He seemed like a real character.

I've read that a printout was given to each worker who started a job at AMD in Dresden, until Sanders's retirement, that read, "People first, products and profit will follow!"

I did meet Jerry a few times. He wore a suit whose pinstripes were actually the words “Jerry Sanders Jerry Sanders” repeated over and over in tiny type. One time he descended from the rafters of the San Jose arena in a harness, at a concert/party they held for all the employees (with Faith Hill performing).

At one point morale was really low and lots of people quit. We had gotten a bunch of stock options, but the strike price was $42/share and the stock was trading much lower than that (I can’t remember how low, but probably single digits), and people were generally unhappy. So he shows up to cheer us up. He says something like “hey, we’re all feeling it. My wife wants to buy new furniture for the mansion but I had to tell her no because the stock price is so low.”

Needless to say, that didn’t help anything.
 

B01L

Cmaier said:
...“hey, we’re all feeling it. My wife wants to buy new furniture for the mansion but I had to tell her no because the stock price is so low.”

10 dollar banana.jpg
 

Colstan

I thought I'd mention the latest earnings roundup, starting with Gelsinger's stewardship of Intel, courtesy of AnandTech. Year-over-year, team blue is down $3.9 billion in revenue, with an operating loss of $175 million. (Technically, they were in the black, thanks to a tax benefit.) It's not as bad as it was, but hardly the big turnaround that a lot of analysts expected after Gelsinger was named CEO.

It's notable that the Client Computing group was down 17%.

Now, for comparison, let us check in with AMD and Lisa Su's endeavors. Preliminary results, again courtesy of AnandTech. Unfortunately for team red, they missed guidance by $1.1 billion.

The massive embedded gains are from the Xilinx merger. Again, the Client group is notable, down a massive 40%, even worse than Intel.

While we don't have Jensen's team green results yet for this quarter, last quarter they missed forecasts by $1 billion, matching their competitors in the industry, with significantly impacted sales.

I'm sure that most of us are already aware that the fruit company also released results, this time courtesy of Jason Snell over at Six Colors, with many useful charts. Unlike others in the tech industry, Apple beat analyst forecasts, with $90.15 billion in revenue, and $20.7 billion in profit.

The big standout for the quarter was Mac revenue, hitting an all-time high of $11.5 billion.

The 38-year-old platform achieved 25% year-over-year growth.

The reason I highlight the Mac is that most of the transition to Apple Silicon is complete, excluding the low-volume Mac Pro and a single token Intel Mac mini. That means most of the early adopters have already made the switch, so a mad dash among hardcore users is not responsible for that growth. And, while only a single data point, according to Steam, Apple Silicon should surpass Intel among Mac gamers this month.

According to Tim Cook, nearly half of Mac purchases went to customers that are new to the Mac.

When it was first announced, there were a lot of Mac users who were upset by the switch to Apple Silicon, claiming that the risk of dropping Intel and losing x86 compatibility would be the doom of the platform, but the opposite appears to be true. Switching to AMD likely wouldn't have changed the equation. I highlighted the specific results above because of these basic revenue statistics:

Intel's Client group: -17%
AMD's Client group: -40%
Mac sales growth: +25%
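
To put those percentages in absolute terms, here's a quick sketch deriving the implied prior-year Mac revenue from the +25% figure; the derived number is my own arithmetic, not something reported above:

```python
# Derive the implied prior-year figure from a current figure and YoY growth.
def prior_year(current: float, yoy_pct: float) -> float:
    return current / (1 + yoy_pct / 100)

mac_revenue = 11.5e9  # the record Mac quarter quoted above
print(f"Implied Mac revenue a year earlier: ${prior_year(mac_revenue, 25) / 1e9:.1f}B")
```

That works out to roughly $9.2B a year earlier, meaning the Mac added about $2.3B in quarterly revenue while both PC client groups shrank.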

I think it's fair to say that Apple proved its detractors wrong and that the transition to Apple Silicon was an extraordinary success. It's amazing how many times I hear "Apple can't do", and then they go and do it.

Finally, we've often discussed how power consumption is getting out of control, and how, thanks to the switch to Apple Silicon, the Mac is the only platform keeping it under control. Recently, Sam Naffziger, AMD's Vice President of Product Technology Architecture, admitted that power usage is increasing faster than process advancements can make up for.

Intel, AMD and Nvidia are fighting each other in a death spiral that's eventually going to catch up with them. As long as they win by 2% in benchmarks, power usage be damned; meanwhile they hide behind fairytale power consumption specifications. Considering the performance of Resident Evil Village on Apple Silicon, that last bastion of the PC may have had its defenses breached. For the foreseeable future, the PC guys are going to keep pushing the hardware with ever-increasing wattage, until something breaks.
 

dada_dave

This is old, and slightly unfair since he isn't the only major technology figure to make predictions that aged poorly, but Gelsinger went on record in 2008 saying that CUDA and GPU compute in general would be nothing more than curiosities in the annals of computer history.

 