Nuvia: don’t hold your breath

A lot of this depends on how well interfaces evolve. Will some LLM like ChatGPT make it possible to interact reliably with an interface without heavy reliance on keyboard, mouse, or touch? If so, I could see wearables taking off, so long as people don't go mad with everyone chattering at their devices, or with people waving their hands around in front of themselves trying to use gesture-based inputs.

I think this is inevitable, and if the device is mixed reality it can see your hands anyway, so you can use them to gesture at on-display widgets.

Also, I don't necessarily think people will be wearing a headset all day, every day. It will be part of a set of wearable gear - the watch being the start. And no, wearables won't be all things to all people, but the majority of people, both in and outside of work, don't necessarily need a desktop or a laptop. They need access to reference information, communications, and ways to perform calculations.

None of these things necessarily requires the desktop UI paradigm, and if they can be done “out in the field”, so much the better for people who don't have huge compute needs.

Things like ChatGPT will certainly help here, enabling requests like “Hey XXX, how much money did we spend on safety equipment last year per new hire?”, “Hey XXX, please pull up the repair manual for this truck”, or “Hey XXX, what component am I looking at?”.

Huge numbers of people are desk-bound working on reporting for the typical enterprise, and if the data is all accessible in database form, an AI can simply build queries or full reports against it for you on the fly, without the end user needing to know how to build them.
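
A minimal sketch of what that could look like, assuming a hypothetical llm_complete() wrapper around whatever LLM service you use (none of this is a real product's API, and the schema is made up for the example):

```python
import sqlite3

def llm_complete(prompt: str) -> str:
    """Hypothetical call into an LLM service; assumed to return one SQL string.
    A real implementation would call an actual API and validate the output."""
    raise NotImplementedError

# Illustrative schema, invented for the example.
SCHEMA = """
CREATE TABLE purchases (item TEXT, category TEXT, cost REAL, year INTEGER);
CREATE TABLE hires (name TEXT, year INTEGER);
"""

def answer(question: str, db: sqlite3.Connection) -> list:
    # Hand the model the schema plus the plain-English question, ask for SQL only.
    prompt = (
        "Given this SQLite schema:\n" + SCHEMA +
        "\nWrite one SQL query answering: " + question +
        "\nReturn only the SQL."
    )
    sql = llm_complete(prompt)
    # In production you'd sanity-check the generated SQL before running it.
    return db.execute(sql).fetchall()

# e.g. answer("How much did we spend on safety equipment last year per new hire?", conn)
```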
 
Decades ago, someone developed an accessibility mouse for quadriplegics that moved the cursor by tracking where you were looking. If Apple is developing AR glasses, perhaps they will include alternative interface sensors that can capture gestures without requiring you to wave your hands around.

The Microsoft HoloLens has been doing eye tracking for several years now, and you can already get Tobii eye trackers for PC (not sure if they work on Mac).
 
Eye tracking is fairly robust technology nowadays. We use it for language processing experiments and cognitive development studies. I think one of our labs has glasses that cost less than $200 and deliver very good results.
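
To give a sense of how gaze alone can drive a UI, here's a toy dwell-to-select loop. The gaze_position() feed and the widget layout are made up for illustration; real trackers expose a similar gaze stream through their SDKs:

```python
import time

# Hypothetical on-display widgets: name -> (x, y, width, height).
WIDGETS = {
    "open_manual": (100, 100, 200, 80),
    "call_support": (100, 220, 200, 80),
}
DWELL_SECONDS = 0.8  # how long gaze must rest on a widget to count as a "click"

def gaze_position() -> tuple[float, float]:
    """Hypothetical stand-in for the eye tracker's gaze stream."""
    raise NotImplementedError

def hit_test(x: float, y: float) -> str | None:
    # Return the widget under the gaze point, if any.
    for name, (wx, wy, w, h) in WIDGETS.items():
        if wx <= x <= wx + w and wy <= y <= wy + h:
            return name
    return None

def dwell_select() -> str:
    current, since = None, time.monotonic()
    while True:
        target = hit_test(*gaze_position())
        now = time.monotonic()
        if target != current:
            current, since = target, now  # gaze moved; restart the dwell timer
        elif target is not None and now - since >= DWELL_SECONDS:
            return target  # gaze rested long enough: treat it as a click
```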
 
Knowing some of his goals, absolutely. He wanted to see computing as something that was just part of the fabric of your home/office/etc.
Most people don’t want a computer per se. They just want something to do the things they need to do.
I think you're both absolutely right. Part of why I was stressing over my Mac vs. PC debate is that I'm not simply choosing between two identical tools, even if they appear functionally similar. With a PC, I'd be cobbling together various parts from dozens of manufacturers, and the same goes on the software side. There's a lot of flexibility in that, but it can come with substantial downsides as well. I used to revel in the chaos of it all, but not so much anymore.

When I purchase a Mac, it's not just for macOS, or Apple Silicon, or iCloud. What Apple offers is an integration of hardware, software, and services that simply can't be matched by the PC vendors.
 

I've had this discussion with people for years over when the "year of the Linux desktop" will finally happen.

My prediction is that the year of the Linux desktop will come somewhat after the majority of end users migrate off the desktop as their primary platform. Some could argue that this has already happened, with mobile phones being the primary (in some cases, only) compute device for many people, and that "Android is Linux" (that's a bit of a crock imho, but that's just my view).

Either way, I think the desktop metaphor will be relegated to niche use cases (3D design, code, etc.) before 2030, and the majority of people's compute time will be handled by wearables and tablets. Especially for personal use; enterprise will lag somewhat, but that's because they've got huge investments in desktop-driven legacy accounting software.
 
[Attached image: leaked benchmark scores]

Yikes if true!
 
Not surprising given the delays! It would’ve been solid a few years back, but launching this alongside or after A17/M3… yikes 😅

The end product might not be too bad if efficiency is on point, but it’s hard to imagine Qualcomm/Nuvia matching Arm on that front.

Anyway, forget this Nuvia stuff! Qualcomm's RISC-V IP is 100% definitely guaranteed to beat everything! 2nd, 3rd… erm, nth time's a charm! 😂
 
Could be preproduction teething numbers? But yeah, big yikes if that's the final product.

Pre-production numbers almost always affect clock speed, not IPC. Unless they’re planning to double the clock or something, it’s not looking great.
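
To put that reasoning in one line: single-thread score scales roughly as IPC × clock, so with IPC fixed by the design, pre-production silicon can only recover score through frequency. The numbers below are made up purely for illustration:

```python
def projected_score(pre_score: float, pre_clock_ghz: float, final_clock_ghz: float) -> float:
    # IPC is baked into the design, so the score scales ~linearly with clock.
    return pre_score * (final_clock_ghz / pre_clock_ghz)

# e.g. a 1700 single-core score at 2.6 GHz projects to ~2158 at 3.3 GHz
print(projected_score(1700, 2.6, 3.3))
```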
 
I forget, what is the GB5->GB6 conversion factor?

I don't know how universal this is, but a base M1 scored (in single thread) roughly 1700 in GB5 and 2300 in GB6. A base M2 scored ~1900/2600. So if the numbers are accurate and it's launching late this year/early next year on N3 as rumored, then it's roughly equivalent to something between an M1 and an M2 on N5/+, and it requires slightly higher clocks than even the M2 to get there. An X3 in a phone scored 1500/1900 (a worse GB5/GB6 ratio, unsure why - different sources for all of the above, so 🤷‍♂️), so one would expect better from a laptop variant with much higher clocks, but I couldn't trivially verify that an X3 is equivalent, nor what its power draw is at that clock speed/cache setup.

So it's not truly awful, but it's not great either, especially given the hype. That said, it'll be a big uplift compared to Qualcomm's previous laptops… if only because those previous ones were so phoned in. ;)
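
Working out the conversion factors implied by those scores (just the arithmetic on the numbers quoted above):

```python
# GB6/GB5 single-core ratios implied by the scores quoted above.
scores = {  # chip: (GB5, GB6)
    "M1": (1700, 2300),
    "M2": (1900, 2600),
    "X3": (1500, 1900),
}

for chip, (gb5, gb6) in scores.items():
    print(f"{chip}: GB6/GB5 ≈ {gb6 / gb5:.2f}")

# M1: ≈ 1.35, M2: ≈ 1.37, X3: ≈ 1.27
# i.e. roughly a 1.3-1.4x factor, with the phone-bound X3 on the low end.
```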
 
I would assume that ARM designs cores for each process they are likely to be used on, as Samsung's N3 is significantly different from TSMC's, and each calls for particular layout schemes.
 
Yes.
 
If these leaks are correct, it's really bad news for Qualcomm. That's basically sub-A14 performance per clock - not enough to challenge even the current low-end mobile x86.
 
The big day approaches….soonish.


X gon' give it to ya! Maybe.

It'll be interesting to see how Apple is supposedly in trouble today because of an SoC that won't be delivered for a year.

Snark aside, if they actually do deliver the kind of performance that leads people to leave x86 for this, then it might actually be the kind of thing that helps Mac games. Currently, game pipelines are optimised for discrete-GPU machines, and Apple seems to have issues getting devs to do the work to optimise for a unified memory architecture. If that becomes the norm in PC land (big if), then performance could improve in Mac games. Or it might lead to nothing!
 
ARM asks for restraining order in…
 