Nuvia: don’t hold your breath

I'm not sure Qualcomm actually plans to win on these arguments before the regulators; it reads more like a tactic to keep pressuring ARM into dropping its own case. Basically: "you're smaller than we are, and we don't have to win, but we'll drown you in counterclaims and regulatory complaints until you give up." I'm not sure ARM is small enough to buckle under that, but that seems to be the strategy, and I guess Qualcomm considers every bit of pressure a win.

Having a lot of experience representing companies of Arm's and Qualcomm's size, I don't think this is going to do much to pressure Arm. What you'll see instead is Arm raising antitrust counterclaims against Qualcomm around the world.
 
I don't think it's sketchy. We saw the same doubt about the 8 Elite's improvement over the Snapdragon X Elite (it was yielding 4.3GHz as standard, and if that was the frequency profile in a phone, it implied a massive improvement in power, which is what happened). It also matches the earlier rumors, fwiw.

N3E to N3P gets a ~5% iso-power, iso-architecture boost in frequency, SME is another ~5% or so, and beyond that they need about 15% more general IPC without blowing up power.
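As a rough sanity check (treating the gains as multiplicative, and using the estimates above rather than any official figures), those pieces compose to roughly the 25-30% mentioned below:

    # Toy calculation: compose the estimated gains multiplicatively.
    # All figures are the guesses from the comment above, not official numbers.
    process_freq = 1.05  # N3E -> N3P: ~5% frequency at iso-power, iso-arch
    sme_gain     = 1.05  # SME: ~5% or so in GB6-style ST workloads
    ipc_gain     = 1.15  # ~15% additional general IPC on top of that

    total = process_freq * sme_gain * ipc_gain
    print(f"Combined ST gain: ~{(total - 1) * 100:.0f}%")  # ~27%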

Looking at the cache hierarchy, there's room to grow there to reduce data movement: they have a 12MB L2 and an 8MB SLC, and also a substantially smaller core. Given the rate of improvement from Oryon V1 to V2 and where Oryon IPC is at the moment, I don't find this that unlikely. Keep in mind, though, that their phone CPU's peak power is still 10-15% higher than Apple's on ST, at 10-15% less performance, so there's that.

In other words, if A19 has a gain of around 10% without any power increase, and this rumor is true, Apple will still be on top, albeit by less than ever considering the curve. That much I think is pretty much guaranteed [smaller margins than ever, though this is already true].
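To put rough numbers on that margin argument (these are illustrative midpoints of the ranges quoted above, with an assumed ~20% iso-power gain for the 8 Elite 2 and ~10% for A19; not predictions):

    # Toy comparison, normalizing Apple's current ST score to 100.
    apple_st = 100.0        # Apple today (ST), normalized
    qcom_st  = 87.0         # ~13% less performance at ~10-15% higher peak power

    apple_next = apple_st * 1.10   # assumed ~10% A19 gain at iso-power
    qcom_next  = qcom_st * 1.20    # assumed ~20% 8 Elite 2 gain at iso-power

    print(f"Apple {apple_st:.0f} -> {apple_next:.0f}, Qualcomm {qcom_st:.0f} -> {qcom_next:.0f}")
    print(f"ST gap shrinks from ~{(apple_st / qcom_st - 1) * 100:.0f}% to ~{(apple_next / qcom_next - 1) * 100:.0f}%")

On those assumptions Apple stays ahead, but by a noticeably smaller margin, with Qualcomm still drawing more power at peak.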

Anyway, I would be happy to bet on a >15% gain in iso-power GB6 ST performance for the 8 Elite 2. 25-30% is pushing it, but not insane in context.
 
IMO, the GPU is more impressive. Sucks that it will be stuck with phone-level game slop.
Yeah, and it's Android games, which is even worse. Mobile gaming is so underutilized; we barely have anything I'd consider Nintendo-class other than Genshin, even though it's perfectly possible. Apple could create clones of some big Nintendo games for Mac and iPhone without breaking a sweat, IMO.

At any rate, pending Nvidia's WoA entry, games and anti-cheat software will probably get ported more often, and this GPU, drivers permitting, might be pretty good for thin-and-light laptops (but again, I suspect Nvidia will do much better; it shouldn't be a contest).
 
Not a laptop, but a tablet running Windows.
Whether a 2-in-1 with a desktop OS counts as a laptop or a tablet is a philosophical question I don't intend to resolve. It's a fanless Windows device with a Snapdragon processor, and as far as I (or the article's author) am aware, the first of its kind, when such a device probably should have been the priority.
 
So how have they been selling? I've seen universities and companies explicitly tell students and employees not to buy these due to software compatibility issues, and even the lack of GPU options for things like CAD and Blender. If you're buying a Windows laptop, the biggest appeal other than familiarity is the combination of software compatibility, a wide range of options, and pricing. Qualcomm-powered devices seem to lose a lot of that.

However, there's definitely a market for "true" 2-in-1 laptops that let you use a stylus with desktop apps, and I still think Apple will address it at some point.
 
If true, I'm curious why they would push the frequency this much.

I don't understand the point of this article. It says they are testing at >5GHz, but that the final speed will be determined by yadda yadda yadda.

That’s how device qualification works.

We design a chip to run at X frequency, and when we get samples back, we test it at ½X, X, 2X, etc., for many reasons. For example, if it fails at speeds >X, and we diagnose the circuit path that fails and find that it's consistent, we can compare that to what our modeling told us and potentially improve our modeling. We also compare power consumption at different frequencies to our predictions, identify potentially unpredicted thermal issues, etc.
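For illustration, here's a minimal sketch of what that kind of characterization sweep looks like in spirit; every number, name, and threshold is made up, and a real flow drives actual test hardware rather than a stub:

    # Hypothetical post-silicon frequency sweep (illustrative only).
    TARGET_GHZ = 4.0  # X: the frequency the chip was designed for

    def run_on_silicon(freq_ghz):
        """Stub standing in for running the part on a tester at freq_ghz."""
        # Pretend the part fails above 1.25x the design frequency and that
        # power scales roughly quadratically with frequency.
        return {"passed": freq_ghz <= 1.25 * TARGET_GHZ,
                "watts": 2.0 * (freq_ghz / TARGET_GHZ) ** 2}

    # Pre-silicon model predictions we want to validate against.
    predicted_watts = lambda f: 1.9 * (f / TARGET_GHZ) ** 2

    for f in [0.5 * TARGET_GHZ, TARGET_GHZ, 1.25 * TARGET_GHZ, 2.0 * TARGET_GHZ]:
        r = run_on_silicon(f)
        delta = r["watts"] - predicted_watts(f)
        print(f"{f:4.1f} GHz  pass={r['passed']}  power vs model: {delta:+.2f} W")
        # Failures above X and power deltas feed the timing/power models.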

All of that provides data that we can feed back into our analysis tools so that next time we can make an even better chip.

It doesn't mean we have any intention of shipping the chips at those frequencies, and the article doesn't claim they intend to do that either. The point is that every chipmaker does this.
 