Nuvia: don’t hold your breath

There are a couple of new videos out about the X Elite. The first one, from MaxTech, goes over some benchmark numbers. They boast of their NPU getting 45 TOPS (unspecified precision) vs. the M3's 18. Still seems slippery to me. Also, it seems like they are hinting there will be no 80-watt version. Not surprising.
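For context on why "unspecified precision" matters: NPU TOPS figures are usually quoted at INT8, and halving the precision of the units roughly doubles the headline number. A back-of-envelope sketch; the MAC count and clock below are illustrative placeholders, not published Qualcomm or Apple figures:

```python
# Rough TOPS arithmetic: TOPS = 2 ops per MAC * MAC units * clock (Hz) / 1e12.
# The MAC count and clock here are made-up round numbers chosen to land
# near the quoted 45 TOPS, NOT figures from any spec sheet.
def tops(mac_units: int, clock_hz: float) -> float:
    """Peak throughput in trillions of ops/sec (1 MAC = 2 ops)."""
    return 2 * mac_units * clock_hz / 1e12

# A hypothetical NPU with 16,384 INT8 MACs at ~1.37 GHz quotes ~45 TOPS:
print(round(tops(16_384, 1.37e9)))        # ~45
# The same silicon at FP16 (half the MAC throughput) would quote ~22 TOPS,
# which is why comparing numbers without a stated precision is slippery.
print(round(tops(16_384 // 2, 1.37e9)))   # ~22
```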



I've only watched the first video so far. Sorta watching the second.

Focusing on the CPU, I think @leman and others are entirely correct that this is almost exactly what one would expect from a 12 P-core M2 CPU that uses dynamic voltage and frequency scaling (DVFS), the usual x86 approach, to achieve its scaling and has no E-cores. I would've liked to have seen comparisons with AMD here, as @exoticspice1 pointed out; especially in x86 laptops they, not Intel, are the ones to beat in terms of performance and performance/W, even if Intel is still the larger chipmaker.

I am torn. I do wish they had been bought by someone other than Qualcomm, if nothing else because going after x86 in the server space, which is where x86 makes much of its profit, is what Nuvia was originally aimed at. On the other hand, releasing quite a good first-generation product for consumers could also spur the growth of ARM in the PC space. That could even have small knock-on benefits for Apple users (thinking of Wine/CrossOver only having to emulate Windows rather than also going through Rosetta, and greater optimization for ARM overall), even if for Apple it could slow growth against the Windows ecosystem, which is one reason Microsoft was so hot under the collar to get a good ARM consumer chip.

An interesting aspect of this is that we get a window into what an Apple chip would look like without its E-cores. It turns out scaling clock speed by itself is pretty damn good if you have a good P-core design! This is probably one reason why AMD has been slower to adopt E-cores in its consumer designs than Intel was. ;) However, Qualcomm is likely operating at lower margins than Apple here. That is nothing new in the PC space relative to Apple, but a significant disadvantage of relying on P-cores alone is that Qualcomm has to spend a lot more silicon to achieve its multicore performance at its target wattage. It's great per mm^2 if you're building an 80W monster (and looking at some Windows laptops I'm not convinced that MaxTech is right here, and even then he was only skeptical about laptops going there), but Apple having such good E-cores is a big advantage in the M3 and M3 Pro range.
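The tradeoff above falls out of the standard dynamic power model, P ≈ C·V²·f: since voltage has to rise roughly with frequency, power grows close to cubically with clock, which is why more cores at lower clocks (or E-cores) tend to win on performance per watt. A toy sketch with made-up constants, not measurements of any real chip:

```python
# Toy DVFS model: dynamic power P = C * V^2 * f, with V assumed to scale
# roughly linearly with f. All constants are illustrative placeholders.
def dynamic_power(freq_ghz: float, c: float = 1.0, v_per_ghz: float = 0.3) -> float:
    v = v_per_ghz * freq_ghz        # crude linear V-f relation
    return c * v**2 * freq_ghz      # power grows ~cubically with clock

# One core pushed to 4 GHz vs. two cores at 2 GHz delivering the same
# aggregate throughput (under these toy assumptions):
one_fast = dynamic_power(4.0)
two_slow = 2 * dynamic_power(2.0)
print(one_fast / two_slow)  # roughly 4x the power for the same nominal work
```

Under this model, halving the clock and doubling the core count quarters the power for the same throughput, which is the lever a P-core-only design like Oryon pulls, at the cost of extra silicon.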

The GPU performance basically confirms our earlier analysis of the Adreno GPU: it's fine, but the hype was unwarranted. I know people were not thrilled about the hype around Nuvia/Oryon, but matching the M2 in their first outing is better than just fine; it's damn good, especially for the PC space. Maybe I'm doing the GPU a disservice, but … at least in raster performance there's nothing terribly special here at first glance. And we know Qualcomm's compute performance has historically been terrible, but maybe they'll focus on those drivers now that they have a PC chip.

The Qualcomm NPU is interesting, and obviously a big focus for them. It sounds like Apple will also be looking to make theirs much bigger in the next generation, and it should be interesting to see what else Apple does with matrix units, i.e. in the GPU, if anything.

I do not understand Cornaby’s comment. Qualcomm has its own GPU architecture, which is a TBR rasterizer with small tiles. It’s not TBDR and never was. And they never used PowerVR IP?

I guess he just always assumed it was a TBDR design?
 
Nice to see Qualcomm supporting everything, not just DirectX:
“Qualcomm says it has Adreno GPU drivers for DX11, DX12, Vulkan, and OpenCL and will also support DX9 and up to OpenGL 4.6 via mapping layers.”

After this I'm feeling positive about it. Now, on to the reviews in June.
Any information yet on Linux drivers?
 

🙃

Apple can also optimize the performance of MacBooks since it controls both the hardware and macOS software.

This is becoming less of an issue, though. Snapdragon is an SoC, so if Windows is targeting it, there is not much specialized hardware left on a typical motherboard. The SSD might make a difference, but only in terms of the performance it can yield.
 
It appears that tomorrow will see the launch of some X Elite models. There are some new scores which were shown at a press event today. They are good. Around 2900. No leaks about power use unfortunately. Congratulations!
 

This thread started Nov 2021. Hope nobody was holding their breath :-)
 
These seem like serious allegations. Anyone know if this site is reputable?
Yes. The guy is a known quantity, mostly focuses on the business end of tech online. I’ve seen him interact with other tech people. He’s very … outspoken with a lot of strong opinions. I hope he’s mistaken …
 
Would you say he’s reliable?
 
For the most part, but a lot of what he does is behind a paywall. I'm not on Twitter anymore, but when I was, the conversations I saw him have with others in tech definitely showed mutual respect, though as I said he can be very opinionated. And he has a special disdain for companies he feels are pulling a fast one. You can tell that by the tone of his article here.

That doesn’t mean he’s always right, but this is something to keep an eye on. It seems he’s blaming the software stack above the silicon. I wonder if that means Linux will be okay but Windows isn’t? He seems to single out Windows on ARM, which he still calls WART (he’s not an MS fan … or an Intel one; while I’ve seen him rant about just about every tech company, he has a special ire for those two).
 
It seems that Charlie has a bone to pick with Qualcomm :) Some time ago he published an article claiming that Qualcomm uses mobile-phone power converters that are not sufficient for the task of powering the Oryon. The thing is, he is sometimes a bit sensationalist, so I didn't take it too seriously, but given the huge power-draw disparity between single- and multi-core operation on Oryon, I now think his information might be accurate.
 
I really struggle to understand some of these recent scores. They are all over the place.
[attached screenshot: recent benchmark scores]
 