Nuvia: don’t hold your breath

Citysnaps

How about...

2023 is the year when computers using Qualcomm CPUs would be released to the public, presumably in volume. Leaving 2022 and a portion of 2021 for chip development, initial/trial fabrication (presumably from Intel) and characterization by Qualcomm with samples delivered to its customers. And then if everything went well, fabrication in production quantities for Qualcomm customers.

Is that amount of time unreasonable? That was basically how our process played out, but we were a very tiny company.
 

thekev

The lawsuit from Apple alleges that Williams exploited Apple technology and poached other Apple employees to join him at Nuvia.

Williams then fired back with his own lawsuit, saying that Apple illegally monitored his text messages and that his so-called “breach of contract” is unenforceable. There has been no resolution in this lawsuit yet.

If he exploited trade secrets, that's probably something. Non-compete and similar agreements are a scourge and should really be unenforceable everywhere, not just California. They are usually unenforceable here.
 

Cmaier

If he exploited trade secrets, that's probably something. Non-compete and similar agreements are a scourge and should really be unenforceable everywhere, not just California. They are usually unenforceable here.

I read the complaint, and I believe he is alleged to have used company resources and started the new company on Apple’s dime? I may be misremembering. I think he was recruiting folks from Apple while working at Apple (perhaps through an alleged straw man, my former colleague Manu).

The issue is not breach of any non-compete, though.
 

Nycturne

How about...

2023 is the year when computers using Qualcomm CPUs would be released to the public, presumably in volume. Leaving 2022 and a portion of 2021 for chip development, initial/trial fabrication (presumably from Intel) and characterization by Qualcomm with samples delivered to its customers. And then if everything went well, fabrication in production quantities for Qualcomm customers.

Is that amount of time unreasonable? That was basically how our process played out, but we were a very tiny company.

Not unreasonable in terms of development, but may pose issues in keeping up with the competition. It also telegraphs your moves to those competitors which can now move to try to block you out before you even get started.

Here’s the thing, it seems a bit weird to me to use this acquisition to go after the PC space, while ignoring the mobile space where the best Android devices still can’t seemingly keep up with Apple in terms of performance. And to spend time comparing to Apple is weird since it’s really Intel/AMD that will be their competition. This move all up seems more to act as a lever to get their SoCs into the PC space where they are effectively a non-presence today (i.e. larger growth potential). Considering their tendency to ask for royalties on top of the SoC price, I’m not sure how well that will fly in the PC space, but certainly seems like a juicy target for Qualcomm.

That said, I’ve never been super great at the whole “keep shareholders happy” aspect of business. Maybe I’m missing something.
 

Cmaier

How about...

2023 is the year when computers using Qualcomm CPUs would be released to the public, presumably in volume. Leaving 2022 and a portion of 2021 for chip development, initial/trial fabrication (presumably from Intel) and characterization by Qualcomm with samples delivered to its customers. And then if everything went well, fabrication in production quantities for Qualcomm customers.

Is that amount of time unreasonable? That was basically how our process played out, but we were a very tiny company.

They seem to be claiming the chips were already “redesigned,” which can’t be right; otherwise they would be available much sooner. It used to take us 2+ years to design chips (from scratch, not spins) with only 100 million transistors at AMD. Presumably QC’s chip will have tens of billions of transistors. The good news is that Apple has shown you can do it in about a year and a half. (Presumably they leave some performance on the table doing it that way. I think I’ll start a thread about how CPU design works and the trade-offs between ASIC and custom methodology.)
 

Cmaier

Not unreasonable in terms of development, but may pose issues in keeping up with the competition. It also telegraphs your moves to those competitors which can now move to try to block you out before you even get started.

Here’s the thing, it seems a bit weird to me to use this acquisition to go after the PC space, while ignoring the mobile space where the best Android devices still can’t seemingly keep up with Apple in terms of performance. And to spend time comparing to Apple is weird since it’s really Intel/AMD that will be their competition. This move all up seems more to act as a lever to get their SoCs into the PC space where they are effectively a non-presence today (i.e. larger growth potential). Considering their tendency to ask for royalties on top of the SoC price, I’m not sure how well that will fly in the PC space, but certainly seems like a juicy target for Qualcomm.

That said, I’ve never been super great at the whole “keep shareholders happy” aspect of business. Maybe I’m missing something.

The telegraphing is a very interesting point. If they had something, they’d just shut up and surprise everyone when it’s ready. They are trying to freeze the market precisely because they don’t have confidence.
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,210
Reaction score
8,253
If the Snapdragon 920 is ARMv9, they can put it out as AArch64 only, which would save them a big wad of cruft, I suspect. Supporting AArch32+Thumb has got to cost something.

I think with Thumb it would be difficult for them to match the wide issue of the M1, unless they limit it to some cores, or add a pipe stage for pre-decode. Hard to know for sure without sitting down to sketch out the decode logic. Not nearly as bad as x86, of course, because at least you’re talking about just a couple of integer multiple widths. Probably some other implications throughout the pipeline as well.
 

Yoused

A4 or earlier? A4 was the first one. Before that they were using bought SoCs. Wiki-thingy, FWIW, says that they supported A32/T32 from the A6 (first real in-house design) through the A10, but T32 may be an unfounded assumption. Whoever wrote that probably has no solid proof that there really was T32 in there, unless it can be shown that the Xcode/Swift compiler will generate Thumb code.
 

Cmaier

A4 or earlier? A4 was the first one. Before that they were using bought SoCs. Wiki-thingy, FWIW, says that they supported A32/T32 from the A6 (first real in-house design) through the A10, but T32 may be an unfounded assumption. Whoever wrote that probably has no solid proof that there really was T32 in there, unless it can be shown that the Xcode/Swift compiler will generate Thumb code.

Interesting. Pretty sure they never used it for anything, if it existed.
 

Nycturne

Interesting. Pretty sure they never used it for anything, if it existed.

Oddly enough, Apple still has their documentation up on ARMv6 and ARMv7 discussing some details on how to avoid getting caught by a gotcha in Thumb mode. But it mostly looks like it was meant for folks writing some assembly by hand. I don't remember Xcode ever letting you specify Thumb mode all up, but I could be wrong.

The telegraphing is a very interesting point. If they had something, they’d just shut up and surprise everyone when it’s ready. They are trying to freeze the market precisely because they don’t have confidence.

If that's their play, I'm not sure it'll work to their advantage. Anyone willing to step into the AMD/Intel fight now isn't going to be dissuaded by Qualcomm saying they'll have something in 24 months.
 

Cmaier

Oddly enough, Apple still has their documentation up on ARMv6 and ARMv7 discussing some details on how to avoid getting caught by a gotcha in Thumb mode. But it mostly looks like it was meant for folks writing some assembly by hand. I don't remember Xcode ever letting you specify Thumb mode all up, but I could be wrong.



If that's their play, I'm not sure it'll work to their advantage. Anyone willing to step into the AMD/Intel fight now isn't going to be dissuaded by Qualcomm saying they'll have something in 24 months.

Who knows.
 

thekev

I read the complaint, and I believe he is alleged to have used company resources and started the new company on Apple’s dime? I may be misremembering. I think he was recruiting folks from Apple while working at Apple (perhaps through an alleged straw man, my former colleague Manu).

The issue is not breach of any non-compete, though.

So yeah, that stuff is actually an issue, albeit a bit Machiavellian, which actually brings me some amusement. I always figured The Prince was just some guy writing about his cat.
 

NT1440

What are the odds that Qualcomm’s chips will be “good enough” for the Windows world? Any takes on whether it’s feasible these will near or exceed M1 performance (which will be old hat by then)?

Basically, will this make Windows machines suck or will it just be a new flavor of suckage?
 

Cmaier

What are the odds that Qualcomm’s chips will be “good enough” for the Windows world? Any takes on whether it’s feasible these will near or exceed M1 performance (which will be old hat by then)?

Basically, will this make Windows machines suck or will it just be a new flavor of suckage?

Hard to guess. I would imagine they would be “good enough” for at least some part of the Windows market. The question is whether “good enough” is compelling enough to get anyone to switch from x86. Qualcomm has provided little guidance about what they are trying to achieve. But it seems to me that in order to sell Arm to Windows customers you have to provide something pretty compelling, so they should be aiming to be competitive performance-wise, but at a much lower power expenditure.

I suspect we will see M1-like performance, for what it’s worth, but in 2023 that may not be good enough.
 