Nuvia: don’t hold your breath

The name Nuvia is kind of interesting to me because I recall having read about a company about a decade ago called Via that had a hybrid chip that could natively run either ARM or x86. 32-bit, of course. I wonder if there is any vague link (by people, because Via was bought up by one of the big guys).
 
In my final years at AMD, I was in charge of design methodology. I wrote a lot of our tools. If I was still in the industry, I absolutely would be working on getting AI to replace me. I think it can be done, but only if the people working on the tool truly understand how design works, including all the physics that can get you in trouble.

Heck, there are days when I wake up and think “I should write that tool” even now.
I think Google and at least one of the chip design software companies have released AI layout tools, but so far I don’t think they cover all aspects of the design space. So people definitely are working on it. I’m not sure which techniques they are using or how deeply the learning gets into fundamentals (i.e., which layers of rules - all the way down to measuring quantum effects? - the algorithms are being trained on).
 
The name Nuvia is kind of interesting to me because I recall having read about a company about a decade ago called Via that had a hybrid chip that could natively run either ARM or x86. 32-bit, of course. I wonder if there is any vague link (by people, because Via was bought up by one of the big guys).
Exponential originally was going to be a hybrid x86/PowerPC chip: essentially an x86 front end, with PowerPC serving as the microcode. There were still bits of x86 detritus in it when we taped out.

If I remember correctly, the reason we started the Austin office was so that they could focus on resurrecting that, and not be dragged into our day-to-day work on the pure PowerPC part. That Austin office became EVSX around the time we were going out of business. EVSX was later renamed Intrinsity, which was purchased by Apple early on in their A-series work. I knew those guys very well, since many of them were relocated from the cubicles surrounding mine.

Anyway, Via was a pretty major company. They made x86 clones, among other things - I think they bought that business from National Semiconductor, but I may be misremembering. Remember that AMD was just one of many similarly situated x86 purveyors for a long time (Cyrix, Rise, IDT, Transmeta, and so on; even IBM had a stealth project in Vermont); it wasn’t until AMD bought NexGen and the NexGen guys ripped up their design methodology that AMD started to rise above the other competitors.
 
I just looked at the list of VIA processor cores, and one at the bottom says it supports MMX, 5 versions of SSE, SHA, AES, FMA3 and 10 AVXs. I mean, I was hearing Neon was clunky or something, but that is one metric shit-ton of add-ons to x86.
 
Here's the thing: I have a higher opinion of Ballmer than some here, but I got a bit of a view into his management style and saw why he wasn't effective in the role. And it wasn't because of a lack of vision. He may have been a cheerleader, but he was smart enough to know that that persona isn't the one you make decisions with when it comes to the future of the company.

Some of those reasons Ballmer wasn't effective are still present in the culture of the company, namely the fiefdoms issue and the whole idea of bottom-up management. This allows a giant company to act in so many different ways that a company run by a controlling perfectionist like Jobs never could, but it also means the company itself can be its own worst enemy, and you have VPs who will protect their own vision against other groups, and even senior leadership, at times when maybe they really shouldn't.
Good perspective - I forgot you had an inside view.

Corporate cultures can be changed, but it takes exceptional circumstances. The example which comes to mind is actually Apple and Jobs - Apple had a huge fiefdom problem in the 1990s, and Jobs had to deal with it when NeXT reverse-acquired Apple. But you don't get the political capital to rock the boat that much unless the company's in trouble (which Microsoft isn't) and you're a figure like Steve Jobs (which Nadella isn't).
 
I just looked at the list of VIA processor cores, and one at the bottom says it supports MMX, 5 versions of SSE, SHA, AES, FMA3 and 10 AVXs. I mean, I was hearing Neon was clunky or something, but that is one metric shit-ton of add-ons to x86.
Intel absolutely loves coming up with piecemeal ideas for improving these kinds of instructions, giving each new extension its own 'feature supported' bit, and shipping processor cores which don't support random features forever since that helps adoption of new features so much.
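
To make that concrete, here's a minimal sketch (my own illustration, nothing official from Intel) of the kind of runtime feature probing x86 software ends up doing, using the GCC/Clang builtins:

```c
#include <stdio.h>

int main(void) {
    /* Illustrative only: query a handful of feature bits at runtime,
       since any given core may support an arbitrary subset of them. */
    __builtin_cpu_init();
    printf("SSE4.2: %d\n", __builtin_cpu_supports("sse4.2"));
    printf("AVX2:   %d\n", __builtin_cpu_supports("avx2"));
    printf("FMA:    %d\n", __builtin_cpu_supports("fma"));
    printf("AES-NI: %d\n", __builtin_cpu_supports("aes"));
    return 0;
}
```

Every new extension means another bit to check and another code path to keep around, which is exactly why adoption drags.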

My impression of Neon is that it isn't clunky at all, TBH. Looks very sane. But I have never written code for it so I could be missing a lot in my skimming.
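
For what it's worth, here's roughly what a trivial loop looks like with the Neon intrinsics (a sketch based on my skimming of the docs, so take it with a grain of salt):

```c
#include <arm_neon.h>

/* Add two float arrays four lanes at a time.
   Assumes n is a multiple of 4, purely to keep the example short. */
void add_f32(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; i += 4) {
        float32x4_t va = vld1q_f32(a + i);      /* load 4 floats */
        float32x4_t vb = vld1q_f32(b + i);      /* load 4 floats */
        vst1q_f32(out + i, vaddq_f32(va, vb));  /* add and store */
    }
}
```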
 
The example which comes to mind is actually Apple and Jobs - Apple had a huge fiefdom problem in the 1990s, and Jobs had to deal with it when NeXT reverse-acquired Apple. But you don't get the political capital to rock the boat that much unless the company's in trouble (which Microsoft isn't) and you're a figure like Steve Jobs (which Nadella isn't).
NeXT's acquisition of Apple is one of the major inflection points in the tech industry. It eventually resulted in Michael Dell's most quotable quote, when asked what advice he'd give to Steve Jobs: "Shut it down and give the money back to the shareholders".

It sits alongside Palm's Ed Colligan concerning the iPhone: "We've learned and struggled for a few years here figuring out how to make a decent phone. PC guys are not going to just figure this out. They’re not going to just walk in".

There is an endless supply of statements from Apple's detractors over the years, predicting the imminent downfall of any and all of the fruit company's products and services. One of the most recent was Pat Gelsinger calling Apple a "lifestyle company", while Intel rehashes Justin Long's character from the archaic "I'm a Mac, and I'm a PC" adverts, showing that Intel is definitely keeping up with the times. Thing is, I think Steve Jobs might have taken the lifestyle remark as a compliment.

That is always going to be the great "what if" scenario, if Jobs had survived. The Apple of today is different than it was back when Jobs was CEO; it's much larger, with additional pressures. He didn't care for the day-to-day operations of dividends and buybacks, nor did he have the patience for the politics required when running the world's most prominent and profitable tech company. Tim Cook was hired by Steve Jobs for his abilities in wrangling the supply chain and herding Apple's fickle suppliers. However, a modern tech CEO has to be comfortable as a skilled political negotiator, walking a tightrope not just with both major political parties in the United States, but around the world as well.

I'm not certain that Steve Jobs would have had the patience to chart a passage through such stormy waters; his mercurial nature was in opposition to such circumstances. Also, Jobs was always on the hunt for the next big thing, which sometimes meant that older product lines would stagnate for far too long. The Mac had occasional moments of apparent lifelessness under Tim Cook, but Jobs seemed to lose interest after the switch to Intel. I'm not certain whether the Apple Silicon endeavor would have been undertaken under Steve Jobs, but fortune favors the bold, something that I believe Cook learned from Jobs, though he applies it in a more measured, judicious manner. How many times have we heard critics say "Apple can't do..." and then they go and do it?

I think Tim Cook deserves credit for putting considerable resources into an overhaul of the company's eldest product. Staying with Intel was the safe bet, and Apple Silicon could have backfired for many reasons, yet it has been an astounding success, allowing the Mac to outshine the mediocre offerings from the PC companies.

I suspect that if Steve Jobs had survived longer, he would have taken on a CTO role, while handing over the day-to-day operations to Cook. Still, that's entirely speculation on my part. Despite the Apple is Doomed™ and "Tim Cook needs to be fired" crowd over at MR, I think it's remarkable that Apple has been as successful as it has been after losing its genius co-founder, while facing strong headwinds from competitors, global events, government regulators, and endless frivolous lawsuits. I think Cook understands his limitations and delegates to employees who excel in those fields, while fostering a cooperative atmosphere that prevents the formation of fiefdoms. Apple is by no means perfect, but Cook seems suited to the role handed to him, and his leadership is a likely reason that Apple has continued to prosper as it grew in size and power.

In an alternate timeline, they purchased BeOS to replace classic Mac OS, and Apple was subsequently acquired by IBM. That would make for a much different technological landscape.
 
Nuvia/Qualcomm is all well and good but... you just know that no Android or Windows OEM is going to use them in the volume Apple is stuffing high end parts into, and even if they try, where is the manufacturing capacity coming from?
 
There is an endless supply of statements from Apple's detractors over the years, predicting the imminent downfall of any and all of the fruit company's products and services. One of the most recent was Pat Gelsinger calling Apple a "lifestyle company", while Intel rehashes Justin Long's character from the archaic "I'm a Mac, and I'm a PC" adverts, showing that Intel is definitely keeping up with the times. Thing is, I think Steve Jobs might have taken the lifestyle remark as a compliment.

I think the biggest, most important thing that separates apple from the rest is covered in the Steve Jobs biography:

The management at Apple want to make products that they themselves actually want to use. Now, whether YOUR desires line up exactly with theirs is another matter, but I know mine align MUCH more closely with theirs than with any other management team's.

As a result, they determine what product they want and then go about designing a cohesive device (that integrates into the rest of their digital life/ecosystem), and then figure out how to make it. Rather than raiding the third party parts bin and trying to slot it into a market segment amongst 15 other devices they already sell at different price points to cover every base (see: Samsung).

Sometimes Apple miss the mark with the execution, but the INTENT is 100% completely and utterly different from the rest of the industry. Whilst the left hand of Microsoft or Samsung doesn't know what the right hand is doing, all of Apple's teams work together to ensure that their product stack MAKES SENSE.

Additionally, I think this culture of making things they want to use themselves is pervasive and heavily ingrained within Apple due to the policies and company culture instilled by Steve. Staff will come and go, but the model has proven to work, and the remaining team members will continue to instill it in new hires.

Steve was a genius in the way he instilled this sort of work-pride and ambition into the company but I don't think his passing will have had anywhere near as much of a negative effect as some think. His work was done and his legacy assured.

I'm 100% sure the Apple Silicon transition was well underway as a contingency plan at least since the early iPhones/iPads, because to anyone paying attention it was clear that the ARM-based designs they were putting out were on course to rocket past x86 within a few product cycles in terms of efficiency, and then eventually outright performance. I know I'd been getting much better portable performance (in terms of user experience for accomplishing a task - not benchmarking) out of iOS devices than mobile PC platforms for almost a decade now.

Additionally, Apple had been burned and had to go through CPU transitions multiple times in the past. If some new CPU manufacturer comes out with a revolutionary product that's on the open market, you can guarantee Apple will be running their product stack on it internally until they can either replicate its performance or beat it - or buy it.
 
I think this is what some people get and some people don’t.

Most people don’t want a computer per se. They just want something to do the things they need to do.

For many, an iPad is enough. Or a phone. In the future, maybe a set of glasses with gesture control.

Eventually i think for most people traditional computing will die and be replaced by wearables. Because if they can do the things you need a computer for while you’re out and about doing real world things… much better.
 
Eventually i think for most people traditional computing will die and be replaced by wearables. Because if they can do the things you need a computer for while you’re out and about doing real world things… much better.

A lot of this depends on how well interfaces evolve. Will some LLM like ChatGPT make it possible to get reliable interaction with an interface without heavy reliance on keyboard, mouse, or touch? If so, I could see wearables take off, so long as people don’t go mad with all the chattering everyone is doing with their devices, or people waving around in front of themselves trying to use gesture-based inputs.

There are some advantages to a smartphone-sized screen using touch that make it something that isn’t disruptive in public, not too large to be cumbersome, and not too small to be hard to use for most things (i.e. the Apple Watch). The whole trend towards phablets and OEMs looking at foldable devices to fit ever larger screens seems to point in this direction for now. So call me a bit skeptical that wearables are the future primary compute device people use, at least for now.
 
Decades ago, someone developed an accessibility mouse for quadriplegics that moved the cursor based on where you were looking (it tracked your gaze). If Apple is developing AR glasses, perhaps they will include alternative interface sensors that can accommodate gestures that do not require waving your hands around.
 
It sits alongside Palm's Ed Colligan concerning the iPhone: "We've learned and struggled for a few years here figuring out how to make a decent phone. PC guys are not going to just figure this out. They’re not going to just walk in".

There is some (perhaps a lot of) truth to that. Not with respect to reimagining what a cell phone could be and how people would interact with it, but more so with meeting the technical requirements of various air interface standards (AMPS, GSM, CDMA, W-CDMA, UMTS, 5G, etc).

There's a fine art in interpreting air interface requirements to one's advantage - which is often required. Apple had the drive, imagination, and ability to start with a clean sheet to nail novel user interfaces. But... unlike Motorola/Ericsson/Nokia (collectively known as MEN in the industry) who pretty much wrote the book on cellular telephony, along with the technical requirements, Apple had no experience in that area.

I believe that changed with Apple collaborating with Motorola on the ROKR cellular phone, which preceded the iPhone. Working side-by-side with Motorola no doubt helped Apple come up to speed with respect to understanding technical air interface standard requirements, how to correctly interpret them, and how to create designs that would meet them.

The ROKR (iirc) was pretty much a flop - and that was OK. IMO, and that's subject to debate, one of Jobs' shrewdest moves was collaborating with Moto, allowing Apple engineers to gain industry expertise that was lacking, and thus set the stage for designing the first iPhone which met requirements and had a much nicer user interface.
 
There is some (perhaps a lot of) truth to that. Not with respect to reimagining what a cell phone could be and how people would interact with it, but more so with meeting the technical requirements of various air interface standards (AMPS, GSM, CDMA, W-CDMA, UMTS, 5G, etc).

There's a fine art in interpreting air interface requirements to one's advantage - which is often required. Apple had the drive, imagination, and ability to start with a clean sheet to nail novel user interfaces. But... unlike Motorola/Ericsson/Nokia (collectively known as MEN in the industry) who pretty much wrote the book on cellular telephony, along with the technical requirements, Apple had no experience in that area.
IMHO, the technical aspects of cellular communication should be easier compared to the human-device interface. The original iPhone was very well received even though it debuted with only 2G connectivity.

Apple’s strength has always been, and still is, the user experience of the whole widget.

Competitors can only win in one or a few technical areas.
 
IMHO, the technical aspects of cellular communication should be easier compared to the human-device interface. The original iPhone was very well received even though it debuted with only 2G connectivity.

Apple’s strength has always been, and still is, the user experience of the whole widget.

Competitors can only win in one or a few technical areas.

I’ve had to learn a lot about cellular standards as part of my current career. I don’t know whether it is harder or easier than things like computer, CPU, or OS design, but it is certainly different. As someone who designed CPUs for nearly a decade and a half, I am in no way qualified to do anything relating to cellular radios, communication protocols, or the like. I think the point was simply that Apple needed jumpstarting in these technologies because they were not Apple’s core competency at the beginning of the iPhone era.
 
IMHO, the technical aspects of cellular communication should be easier compared to the human-device interface. The original iPhone was very well received even though it debuted with only 2G connectivity.

Apple’s strength has always been, and still is, the user experience of the whole widget.

Competitors can only win in one or a few technical areas.

Should be, yes. But as I said above, prior to collaborating with Motorola on the ROKR, Apple had no experience with cellular air interface standards (they're tricky - and there's an art in interpreting them to one's benefit). That was Motorola's/Ericsson's/Nokia's (MEN) wheelhouse, having defined them and having a deep well of experience in communications systems engineering going back a long time. My contention is Jobs realized the benefits of collaborating with Motorola in getting that experience - even though ROKR turned out to be a dud.

It took Apple reimagining what a cellphone's human interface could be to bring about that change. Prior to the original iPhone, Motorola/Ericsson/Nokia cellular phones' user interfaces were pretty poor in comparison. In other words, that was Apple's strength.
 
Motorola engineer Martin Cooper, the godfather of cellular telephony, going back 50 years.

 
My contention is Jobs realized the benefits of collaborating with Motorola in getting that experience - even though ROKR turned out to be a dud.

Here’s where I’m somewhat skeptical. Apple delivered work that made the ROKR an iPod that talked to iTunes over USB. Other than that, it was a rebadged Motorola E398, to the point that the two devices could be flashed with the other’s firmware and everything worked fine. I’m not sure Motorola would have given Apple the opportunity to look at the OS beyond what was needed to deliver the iTunes compatibility.

These OEMs were notoriously protective of their bits, HTC being one of the legends in this space for how tightly they wanted control over things. Everything we got from OEMs back in the day was binary blobs if I remember correctly.

I guess my point is that technical expertise could be bought, while what Apple is doing is ingrained in their work culture.

This is more what I would expect happened. Apple is shrewd enough to buy what they need (including multi-touch tech). Finding a few key engineers with the experience would help a ton here.
 