Nuvia: don’t hold your breath

Correct me if I'm wrong, but wasn't Nuvia like three guys? How many engineers, roughly speaking, are working on Apple Silicon? From my purely amateur observations, these are large projects, where no single individual, or handful of individuals, is vital to a design's implementation. (Except for Jim Keller, because the internet says he is a chip god.)
There were three main guys, but I’ve seen discussion of a couple dozen or more lower-level people having gone. One of the three guys I know really well, having worked alongside him for years. I read very interesting things about him in Apple’s legal complaint re: trade secrets :-)
 
I might be wrong, but my sense is Qualcomm's glory days have long passed. They have a ton of communications-related patents, having written the book on modern communications signal processing and error correction, and no doubt still snag decent licensing fees. But it has been a loooong time since Drs. Viterbi and Jacobs, the real brain trust, ran the company.

I have no idea. I think that if there’s a market for “Intel, but for Arm” then they are well-situated. I just don’t think there will be another “Intel” model in the semiconductor industry. Many of the biggest companies that sell end products (cars, computers, phones) will do their own CPU designs and use TSMC or Samsung or Intel to fab them. The smaller companies will be customers, but Qualcomm will have to compete with other Arm chip designers (Marvell, Samsung, Intel, maybe AMD someday, Nvidia, etc.). Qualcomm still has a stranglehold on radio chipsets, though there are signs of small cracks in that, too.

Qualcomm has at least been smart enough to realize that it needs to leverage its radio stranglehold to get into other markets before it’s too late.
 
There were three main guys, but I’ve seen discussion of a couple dozen or more lower-level people having gone.
So, aside from the underlings, and your trade secrets friend, most/many of the folks still at Apple are top-shelf engineers?
 

What I'm wondering, having been out of the digital communications signal processing ASIC field for a while, is who is pioneering developments in that area. Or is that handled within much larger companies like Apple and Samsung (or Ericsson/Nokia), and you just don't hear about interesting breakthroughs because they're embedded in their tech? We had to tread very lightly on some aspects, such as error correction techniques and power amplifier digital pre-distortion and linearization, knowing that it would be easy for a large company to squash a 10-person company.
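For anyone who hasn't bumped into digital pre-distortion: the idea is to warp the transmit signal with roughly the inverse of the power amplifier's compression curve before it hits the PA, so the cascade comes out close to linear and the intermod products drop. A minimal Python sketch of the memoryless-polynomial flavor, with made-up coefficients (real designs add memory terms and adapt the coefficients from a feedback receiver):

```python
import numpy as np

# Toy memoryless PA model: gain compression grows with signal amplitude.
# All coefficients here are made up for illustration.
def pa(x):
    a = np.abs(x)
    return x * (1.0 - 0.15 * a**2 + 0.02 * a**4)

# Pre-distorter: odd-order polynomial chosen to roughly invert the PA curve.
# A real system adapts these coefficients from a feedback receiver and adds
# memory (memory-polynomial / Volterra) terms; this one is memoryless on purpose.
def predistort(x, c=(1.0, 0.16, 0.03)):  # hypothetical coefficients
    a = np.abs(x)
    return x * (c[0] + c[1] * a**2 + c[2] * a**4)

def imd3_dbc(y, f1=50, f2=55):
    """Level of the 2*f1 - f2 intermod product relative to the carrier, in dBc."""
    spec = np.abs(np.fft.fft(y))
    return 20 * np.log10(spec[2 * f1 - f2] / spec[f1])

# Two-tone test signal at complex baseband; peak amplitude around 0.8.
n = 4096
t = np.arange(n) / n
x = 0.4 * (np.exp(2j * np.pi * 50 * t) + np.exp(2j * np.pi * 55 * t))

print("IMD3 without DPD: %.1f dBc" % imd3_dbc(pa(x)))
print("IMD3 with DPD:    %.1f dBc" % imd3_dbc(pa(predistort(x))))
```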
 
So, aside from the underlings, and your trade secrets friend, most/many of the folks still at Apple are top-shelf engineers?

I would think so. I see 10 folks among my LinkedIn connections who I used to work with and who are now at Apple:

Director of Custom Silicon Management - this guy was an incredibly talented circuit designer who worked at DEC on the Alpha before I worked with him for years.

Senior CAD engineer - became my boss for a few minutes and took over most of my work when I left.

Senior CAD engineer - design verification manager. I worked with this guy at Exponential, when there were two engineers who had to do the verification for the whole chip. I worked with him to debug a nasty wire coupling bug, and I remember printing out giant schematics and manually working out how to generate a test vector to determine whether our theory was right.

Another engineer who worked for me on several chips.

A power optimization lead - I worked with him all the way back on K6, and then he moved to Austin. Very smart guy, though I recall a fun story where we were at a conference together, and another guy we know was talking to me. The guy who is now at Apple came up and offered his opinion on a technical matter, and the other guy looks at his badge, doesn’t see a “Dr.” in front of the name, and says, ”well, I have a Ph.D. to back up MY opinion.” Fun.

Oh, this is fun! I see the *other* of the two design verification guys from Exponential is *also* at Apple!

Ah, another guy who lists his job title as “CAD monkey.” He took over my EDA tools when I left, so he was the one who had to figure out my spaghetti code that performed circuit classification. I didn’t work too much with him, but if he figured out how to keep my code working, he must be talented.

CPU Implementation Lead. He started at AMD a few years before me, and was one of the first people I saw go to Apple. He’s been there a very long time.

Anyway, these are all very experienced folks who know what they are doing.
 
What I'm wondering, being out of the digital communications signal processing ASIC field for awhile, is who is pioneering developments in that area. Or is that handled within much larger companies like Apple and Samsung (or Ericsson/Nokia) and you just don't hear about interesting breakthroughs in that area as it's embedded in their tech. We had to tread very lightly on some aspects such as error correction techniques and power amplifier digital pre-distortion and linearization knowing that it would be easy for a large company to squash a 10 person company.

Yeah, I think that progress is pretty spread around right now. Convolutional coding and Viterbi decoding were a breakthrough, and then along came some MIMO advancements, and most of the time things are just incremental. I’m pretty sure Qualcomm still has a huge concentration of the engineers who come up with the stuff that ends up in the standards, but if you look at the standards declarations, lots of companies have engineers who contribute. I know that Apple has been accelerating that area rapidly, but it’s still nowhere near Qualcomm.
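For anyone who hasn't run into it, the Viterbi algorithm is just dynamic programming over the encoder's state trellis: at every step you keep only the best-metric path into each state. A toy hard-decision decoder in Python for the classic rate-1/2, constraint-length-3 convolutional code (generators 7 and 5 octal), purely as an illustration of the idea, nothing like the soft-decision, heavily parallel machinery in a real modem:

```python
# Rate-1/2, constraint-length-3 convolutional code (generators 7 and 5 in octal),
# with a hard-decision Viterbi decoder. Purely illustrative.
G = (0b111, 0b101)   # generator polynomials
K = 3                # constraint length; 2**(K-1) = 4 states

def encode(bits):
    state, out = 0, []
    for b in bits:
        reg = (b << (K - 1)) | state          # newest bit in the high position
        out += [bin(reg & g).count("1") & 1 for g in G]
        state = reg >> 1
    return out

def viterbi_decode(received):
    n_states = 1 << (K - 1)
    INF = float("inf")
    metrics = [0] + [INF] * (n_states - 1)    # encoder starts in state 0
    paths = [[] for _ in range(n_states)]
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_metrics = [INF] * n_states
        new_paths = [None] * n_states
        for state in range(n_states):
            if metrics[state] == INF:
                continue
            for b in (0, 1):                  # hypothesize the input bit
                reg = (b << (K - 1)) | state
                expected = [bin(reg & g).count("1") & 1 for g in G]
                branch = sum(e != x for e, x in zip(expected, r))  # Hamming distance
                nxt = reg >> 1
                m = metrics[state] + branch
                if m < new_metrics[nxt]:
                    new_metrics[nxt] = m
                    new_paths[nxt] = paths[state] + [b]
        metrics, paths = new_metrics, new_paths
    return paths[metrics.index(min(metrics))]

msg = [1, 0, 1, 1, 0, 0, 1, 0]
coded = encode(msg)
coded[3] ^= 1                                  # flip one channel bit
print(viterbi_decode(coded) == msg)            # the single error is corrected
```

The free distance of this little code is 5, which is why the single flipped bit in the example gets corrected.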
 
Ah, interesting. I did a LinkedIn filter on Nuvia. So aside from their VP of engineering (at Qualcomm), who is the trade secrets guy I mentioned, another guy who was essentially a peer of mine at AMD works there (as a member of technical staff, which I would imagine is not where he’d want to be at this point in his career). He was pretty good. At least as good as the VP, by my recollection. But I’d much rather have the folks I listed from Apple.

Keep in mind, it’s been many years since I worked with these people. Maybe the weaker ones got strong and the stronger ones had head injuries. I don’t know.
 

There was a company in San Jose called ArrayComm that pioneered a lot of beamforming and MIMO advancements. Interestingly, it was founded by Martin Cooper, who is credited with coming up with cell-based wireless telephony at Motorola long ago. We talked with them at one time about collaboration possibilities for developing digital beamforming ASICs for telecom and other applications. The good news was that it didn't go anywhere, as there was a principal engineer there, whom some of us had worked with at a previous company, who would have been difficult to get along with.
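For context on what digital beamforming actually does: with a uniform linear array you apply a complex weight per element so that signals arriving from a chosen direction add coherently, while everything else gets sidelobe-level gain. A tiny narrowband Python sketch with illustrative numbers only (nothing to do with ArrayComm's actual designs):

```python
import numpy as np

# Narrowband phase-shift (delay-and-sum) beamformer for a uniform linear array.
# Half-wavelength element spacing; all numbers are illustrative.
n_elements = 8
d = 0.5  # element spacing in wavelengths

def steering_vector(theta_deg):
    """Per-element phase progression for a plane wave arriving from angle theta."""
    phase = 2 * np.pi * d * np.sin(np.radians(theta_deg)) * np.arange(n_elements)
    return np.exp(1j * phase)

# Steer the main lobe to +20 degrees by conjugate-matching the steering vector.
weights = steering_vector(20.0)

# Probe the array response: full gain on the steered direction, much less elsewhere.
for probe in (20.0, -40.0, 55.0):
    resp = abs(np.vdot(weights, steering_vector(probe))) / n_elements
    print("response at %+5.1f deg: %6.1f dB" % (probe, 20 * np.log10(resp)))
```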

We also had extensive discussions with Altera and Xilinx, who wanted to get into the digital communications processing space for commercial and defense applications. They were stymied trying to implement digital filters and down-converters/up-converters for radios, and couldn't figure out why their FPGA designs were much more than an order of magnitude worse in performance than our ASICs. And FAR worse on power dissipation. We passed on working with them, knowing they just wanted to snag our architecture tricks.
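For anyone who hasn't built one, a digital down-converter is conceptually simple: a numerically controlled oscillator mixes the channel of interest to DC, a low-pass filter rejects everything else, and a decimator drops the sample rate. A toy Python sketch with made-up parameters, just to show the signal chain those FPGA and ASIC designs were implementing:

```python
import numpy as np

# Bare-bones digital down-converter: mix the channel of interest to baseband with
# a numerically controlled oscillator (NCO), low-pass filter, then decimate.
# Parameters are illustrative; in hardware the challenge is doing this at very
# high sample rates within a power budget, not the arithmetic itself.
fs = 1_000_000        # input sample rate, Hz
f_center = 200_000    # carrier to extract, Hz
decim = 8             # decimation factor

def ddc(samples):
    n = np.arange(len(samples))
    nco = np.exp(-2j * np.pi * f_center / fs * n)        # complex local oscillator
    baseband = samples * nco                             # mix the channel down to DC
    taps = np.sinc(np.arange(-32, 33) / decim) / decim   # crude windowless low-pass FIR
    filtered = np.convolve(baseband, taps, mode="same")  # anti-alias filter
    return filtered[::decim]                             # drop to the lower rate

# Sanity check: a tone 5 kHz above the carrier should come out at 5 kHz.
t = np.arange(100_000) / fs
x = np.cos(2 * np.pi * (f_center + 5_000) * t)
y = ddc(x)
peak_hz = np.fft.fftfreq(len(y), d=decim / fs)[np.argmax(np.abs(np.fft.fft(y)))]
print("peak after DDC: %.0f Hz" % peak_hz)  # expect roughly 5000
```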

We did collaborate with National, which had some interesting high-performance/high-speed/high-bit-width ADCs and DACs, but didn't have the digital communication/radio signal processing chops, or the customer breadth.

We finally landed at TI, with pros and cons.
 
Oh, this is fun! I see the *other* of the two design verification guys from Exponential is *also* at Apple!
I'm glad to see that you're enthused about your old colleagues. I find these stories to be quite fascinating. From the outside, there wasn't a lot of information in the general tech press about many of the companies that you've mentioned or worked at. NexGen just appeared out of nowhere, and was bought by AMD. Cyrix was always puttering around the mid-card, Centaur lower than that, until VIA bought both and everyone left. I recall Exponential being a thing that existed, but for the life of me I couldn't tell you what you were working on. Transmeta was a bizarre little endeavor.

Like Grendel from the Anglo-Saxon epic Beowulf, the DEC Alpha was much talked about, but rarely seen.

No offense about your work on the K6, but I didn't pick up my first AMD CPU until the Athlon XP. I fried it with too much voltage over a series of months, then replaced it with a Northwood P4, because you could get ridiculous clocks out of them.

That was back when I was young and stupid and only cared about overclocking and how many FPS I could get out of Quake, even though I didn't play Quake. Now that I've grown to be old and stupid, I've decided to let Apple handle all of the details for me. Last year, I upgraded the system memory inside my Mac mini from 8GB to 64GB, and found it to be a tedious, laborious process. I'd rather just buy the whole widget, not having to worry about BIOS settings, activating Windows, anti-virus, privacy-invading bloatware, or any of the other PC shit, and just let Apple do it for me. The Mac is a superior product on most every level, in my opinion, so it's an easy choice.

Still, I appreciate your smoke pit stories, @Cmaier.
 
This is what we did at Exponential. In the photo in the bio section I appear to be 12 years old.
 

Thanks for the document. Taking a gander at it, some of that is coming back to me. And yeah, you're just a youngin' in that photograph.

Aside from the minutiae of semiconductor design, I think these stories are an important reminder that humans are responsible for them. To the general public, computer chips are magic, Wi-Fi is wizardry, the internet is illusory. It's been trendy, in recent times, for futurists and science fiction authors to use this perception to their advantage. One supposed explanation for the Fermi Paradox is the artificial intelligence singularity. In other words, a sufficiently advanced civilization will be undone by its own creations. It's as if the end result will be a series of self-replicating von Neumann probes that will efficiently turn the Earth, and hence us, into an endless supply of paperclips.

What's lost in all of this is that computer chips are designed by humans. AI software is written by humans. It's not going to be like some bizarre intelligence visiting from Proxima Centauri b, but will have the fingerprints of humanity all over it.

Sorry about the apocalyptic tangent, but your tales are valuable, if for no other reason than as a reminder that technology is very much a human endeavor.
 
Apple has dropped its lawsuit against former chip architect Gerard Williams III. Apple accused him of breach of contract and poaching after he left to found Nuvia, now part of Qualcomm.


Currently, the reasons for Apple dropping the lawsuit aren't publicly known.

As an aside, I've always been suspicious that Mark Gurman's main source for Apple leaks is somebody who works at Nuvia. That would explain why he is becoming less accurate over time. With the M1, Gurman had codenames, specs, all the details. The M2 is less certain, and more vague. If Gurman's best source is an ex-Apple employee who is taking revenge through Bloomberg, then Gurman's accuracy will continue to decline, as that source's useful information runs dry. They would know the roadmap, but not have the most current details, with Gurman filling in the rest with his own opinion while wearing his pundit hat. This is entirely speculation, on my part, of course.
 

Interesting.


I had thought Gurman’s main source was known; people were pretty sure (perhaps erroneously, see below) it was this guy:


I mean, they never say Gurman’s name, only Wayne Ma’s. But if he also leaked to Gurman, that would fit with your logic too, given the timeline. However, now I see that is also speculation, as again only Wayne Ma was mentioned, so maybe Gurman had (or has?) another source entirely and it’s just coincidence. After all, it would be odd if only Wayne Ma was called out and they knew Gurman was getting this info too.
 
I think Gurman has more than one source, but he burned his best one, and is now running off fumes. I don't know exactly who that was, but his reporting has gotten more ramshackle as time has gone on, mixing punditry with facts. I think there's also a good chance that the "Extreme" chip was either an aspirational design that never made it past the drawing board, or a misdirection from either the leaker, or Gurman himself, to protect sources and methods. It just seems odd to me that Gurman was bang on about the entire M1/M2 specs, except for the "Extreme", which is the obvious outlier. We have extensive technical evidence that it never existed beyond the pages of Gurman's articles.

Regardless, given @Cmaier's silence on the matter, that tells me that Gurman's roadmap is wonky and there's some hokum afoot.
 

Today I’m quiet just because I’m at a bat mitzvah.

I’m sure Qualcomm got the case settled by giving Apple a supply payment offset, and Apple isn’t all that worried about Nuvia beating their work at this point anyway.
 
I meant you're quiet about the future of Apple Silicon in general, because you hoodwinked one of your old buddies into telling you about the goods. You're like a little kid who can't wait to open his birthday presents and ends up with cake frosting and wrapping paper all over his face, while claiming he didn't do anything. That negates your ability to speculate publicly on future Apple Silicon products, because you've got the skinny on the good shit. Obviously, you're not going to betray your sources, nor should we ask you to do so. Unfortunately, that means my co-host on this forum is currently hobbled by highly classified information. You wouldn't be so quiet if Gurman were correct, otherwise you would simply play along. I just hope you don't know about multiple years of Apple Silicon plans, because then you'd be about as helpful as me, if I'm dumb enough to switch sides and go PC, that is. You out of confidence, me out of stupidity.

That being said, if I were in your position, then I'd do the exact same thing. Hell yes, indeed.
 

I wouldn’t draw any conclusions. I’m very confusing.
 
My feeling is that Apple should be working to promote and advance ARM designs for everyone. If other systems transition away from x86 sooner, there will be better access to a wider range of software. As long as they can retain the top dog position, having more ARM in the ecosystem can only be good for Apple.
 
I think the problem with the "future is Arm" argument is, and always will be, Microsoft and Windows. From what I gather, running x86 programs on Windows on Arm is suboptimal. Plus, there are all sorts of niche programs, and of course games, that are highly specific to x86 optimizations. All of that can be emulated, but I don't think the power savings and speed improvements of Arm are enough to move an entire dinosaur codebase. With macOS and Linux, sure, that makes sense and is comparatively easy. I'm not convinced about the Windows hegemony. Plus, Intel and AMD have a lot of incentives and resources to make sure that doesn't happen.

This isn't specific to Arm, but this short LTT video goes into detail on why Windows' progress moves slower than molasses in winter. It's not just organizations with legacy applications; the comments section is full of folks who want Windows set in an immutable state. There's also a strong mindshare that x86 has among traditional PC users, which isn't easy to displace.



Plus, we've seen this rodeo before. Windows NT supported PPC, MIPS, Alpha, and x86. Only Alpha doggedly stuck around, and no matter how good FX!32 emulation was, and despite Alpha being a superior design, it didn't dislodge the old x86 dog. I don't see why that would change with this endeavor. It's why I really don't see Nuvia/Qualcomm's chips going anywhere, not simply because I think Apple has a head start, but because I don't think the average Windows PC user is going to want it, even if they don't know the difference between x86 and Arm. If I do the big switch to PC, you wouldn't catch me dead with an Arm design, but it really wouldn't matter, because I would be dead.
 