New Apple CPU rumors

Cmaier

This guy has been fairly accurate in the past, and this matches what I’ve heard: that there will be some sort of “M1” with Avalanche/Blizzard, and that M2 is a bigger leap. (To be clear, I haven’t heard any product names - just that chips that are M1-like with Avalanche/Blizzard have been taped out, and that chips with new cores and a new GPU have also been taped out on a more advanced node.) I haven’t heard anything personally about the A-series processors, though.

[Attached: screenshot of the rumor post]
 
Is this much different than what we already expected? Despite what Shrimp is saying, I think most of us surmised that the follow-up to the M1, which we assumed would be called M2, would be a relatively minor tweak based upon the A15. Then the next chip after that, ostensibly the M3, would be on TSMC's 3nm process, and likely be a more substantial upgrade. Other than the potentially confusing issue surrounding marketing names, Apple adopting ARMv9 is the most notable thing mentioned.

Or am I missing something, @Cmaier?
 

I think the implication is a little different than what some have expected (though I think I posted something along these lines either here or at MR previously). I think most people expected that the next thing coming will be Avalanche-based variations of all the existing M1 chips. Instead, I think Avalanche is going to go into just 1 or 2 variations, probably at the low end just replacing plain ol’ M1 (for MacBook Air, iPad Pro, and maybe a low-end MBP), and that the next “full set” of M-chips will be based on a completely new core. I also suspect, if true, that this means that by next year the M-series chips may come out with new cores before the A-series.
 
Avalanche and Blizzard showed nice improvements in the latest iPhone relative to the prior one.
Anandtech's breakdown showed significant energy-efficiency gains going from A14 to A15 through solid engineering on the E-cores, rather than relying on process improvements alone for performance-per-watt gains.
Appreciating that folks are looking for second-coming-of-jebus-level chips with an insane 20% IPC improvement year on year (at least over at the other place, so that crab cakes chess benchmarks and other niche benchmarks give nicer bragging rights)… this is still a really nice update to look forward to.
It should give the M1 MBA a nice battery-life improvement over the 2020 MBA!
Gotta wonder how Apple will make this work…
Possibly a new-design MBA with an “M1 Avalanche” variant and a smaller battery, letting Apple save on the battery while still advertising battery life similar to the 2020 model??
 
I think the implication is a little different than what some have expected (though I think I posted something along these lines either here or at MR previously). I think most people expected that the next thing coming will be Avalanche-based variations of all the existing M1 chips. Instead, I think Avalanche is going to go into just 1 or 2 variations, probably at the low end just replacing plain ol’ M1 (for MacBook Air, iPad Pro, and maybe a low-end MBP), and that the next “full set” of M-chips will be based on a completely new core. I also suspect, if true, that this means that by next year the M-series chips may come out with new cores before the A-series.
Interesting, thanks for the response. It wasn't that long ago that the Mac appeared to be an afterthought among Apple's management. Even if that wasn't the case, it certainly felt like a side project, at times. It would be wild to see new cores appear inside the Mac before the iDevices. The difficulty, at least right now, is that we only have a sample size of one with the M-series. So, a lot of conventional assumptions may not be correct, in regards to the trajectory that Apple takes with its future processors.
 
Avalanche and Blizzard showed nice improvements in the latest iPhone relative to the prior one.
Anandtech's breakdown showed significant energy-efficiency gains going from A14 to A15 through solid engineering on the E-cores, rather than relying on process improvements alone for performance-per-watt gains.
Appreciating that folks are looking for second-coming-of-jebus-level chips with an insane 20% IPC improvement year on year (at least over at the other place, so that crab cakes chess benchmarks and other niche benchmarks give nicer bragging rights)… this is still a really nice update to look forward to.
It should give the M1 MBA a nice battery-life improvement over the 2020 MBA!
Gotta wonder how Apple will make this work…
Possibly a new-design MBA with an “M1 Avalanche” variant and a smaller battery, letting Apple save on the battery while still advertising battery life similar to the 2020 model??
Yeah, I think Blizzard, in particular, is a pretty huge upgrade. And Avalanche is probably more clock-scalable, which could result in higher frequency bins in some products. I think, though, that the rollout has gone more slowly than they anticipated (because of the pandemic and supply chain issues, and because M1 was so well-received that they didn’t need to rush out its replacement), so they may have adjusted the roadmap to essentially deliver what was going to be M2 as “M1X” (or whatever), and then quickly jump to what was going to be M3 (calling it M2).
 
I guess another possibility is that the Avalanche/Blizzard chip is for the Mac Pro? I mean, if Apple is going to do a one-off, that would be a good machine to do it for.
 
Sounds like Shrimp's source is familiar with the underlying technology, at least to some degree, but is just guessing about the actual marketing names. It doesn't sound like anything new, either; it aligns with what Gurman was saying back in April about the M2, minus the specifics about Avalanche/Blizzard, which is what most of us assumed anyway. Gurman claims that the standard M2 will have 8 CPU cores, unchanged from the M1, while upping the GPU count from 8 cores to 9 or 10. It's good to have confirmation, but Shrimp isn't bringing much new to the table, other than confirming which cores are used, and that Apple will eventually adopt ARMv9 with TSMC's 3nm process.

I wonder if Shrimp is right that this is a "huge upgrade" from the M1, because it appears evolutionary, not revolutionary. We already got revolutionary with the switch from x86 to Apple Silicon.
 

When I was designing chips for AMD, I never knew the marketing names while I was working on them (and seldom kept track after they were actually on sale), so if the source is in engineering this wouldn’t surprise me. We had internal names and sometimes a chip with one internal name had two different marketing names, or vice versa.

I think the only news here, if there is any news, is that it sounds like the v9/3nm stuff may be happening relatively rapidly as compared to the M1 product cycle, though he hasn’t really said that.
 
Is anything interesting coming with the ARMv9 ISA?
Some improvements to the security model, and more flexibility (and maybe performance) in SIMD stuff, mainly, I think.
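For the curious, the headline SIMD addition in v9 is SVE2, which is vector-length-agnostic: one binary runs on any hardware vector width, with predication handling the tail instead of a scalar cleanup loop. A minimal sketch using the Arm C Language Extensions intrinsics (assuming an SVE2-capable compiler and target; the function and variable names are just illustrative, not anything Apple has shipped):

    #include <arm_sve.h>
    #include <stdint.h>
    #include <stddef.h>

    /* Vector-length-agnostic add: the hardware's vector width is queried
       at run time (svcntw = number of 32-bit lanes), so the same binary
       scales from 128-bit up to 2048-bit implementations. */
    void vla_add(const float *a, const float *b, float *out, size_t n) {
        for (size_t i = 0; i < n; i += svcntw()) {
            /* Predicate keeps only lanes with i + lane < n active, so the
               final partial vector needs no scalar tail loop. */
            svbool_t pg = svwhilelt_b32((uint64_t)i, (uint64_t)n);
            svfloat32_t va = svld1(pg, a + i);       /* masked loads */
            svfloat32_t vb = svld1(pg, b + i);
            svst1(pg, out + i, svadd_x(pg, va, vb)); /* masked add+store */
        }
    }

Compare that with NEON, where the 128-bit width is baked into the instructions; the vector-length-agnostic model is arguably the piece that could give compilers more flexibility, and maybe more performance, down the road.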
 
Back when ARMv9 was announced, you didn't seem too impressed with it, and weren't sure that it really deserved a full increment, being more along the lines of a +0.1 release. Regardless of naming, @leman seemed really jazzed about the addition of SVE2, but there didn't seem to be much else that Apple didn't already have an equivalent for. The security enhancements appear to mirror functionality that Apple has already included in its designs.

Still, this is the first time we've seen anything resembling confirmation that Apple Silicon will support v9, not that it's a huge surprise. I recall an Apple engineer on Twitter saying that they collaborate with Arm and that a number of enhancements to the instruction set are a result of Apple's feedback. Is there anything that you think Apple might find useful in v9, @Cmaier, or do you think Apple is mainly just keeping up to spec? In other words, will Apple find any real use for the security extensions, or other additions, particularly within macOS?
 

I think they’re just keeping up. They’ll certainly use the enhancements, but I doubt anyone would notice much change.
 
When I was designing chips for AMD, I never knew the marketing names while I was working on them (and seldom kept track after they were actually on sale), so if the source is in engineering this wouldn’t surprise me. We had internal names and sometimes a chip with one internal name had two different marketing names, or vice versa.
And the same is generally true on the software side in my experience. Less so now that everything is "ship everything, everywhere, all the time".

Large projects I've worked on had the mentality of "Let's make cheeky codenames that make the lawyers unhappy".
 
Back when there was a third x86 competitor, Cyrix was sent a cease and desist by Lucasfilm for using Star Wars trademarks as CPU codenames. For instance, one preliminary x86 design was codenamed "Jedi". Keep in mind that these were internal designations, not something meant for public release, and were leaked to the tabloid tech press, at the time personified by the Register. As @Cmaier has pointed out, this is common practice within tech circles, such as naming conference rooms after science fiction locations, but this demonstrates how protective Lucasfilm has been of its IP. So, yes, it really does make lawyers unhappy when nerds decide to use their favorite fictional sci-fi settings as inspirational nicknames. Cyrix ultimately agreed to give their chips new internal codenames, even though marketing and legal originally had nothing to do with it. I question whether this was legally enforceable, but there was zero motivation for Cyrix to fight over a pet name for pre-release silicon.
 

In our case, it was only semi-related to pop culture, but it was a trademarked name, and it was a reference to a particular weed killer that you can still buy. It's been almost a couple decades now, but the Cyrix mention does ring a bell. But I do know we had lawyers look over the codenames specifically to avoid a Cyrix-like lawsuit.

But man, I remember Cyrix. Used to have a PC compatibility card using one of their 5x86 CPUs.
 
Yep. In my day there were x86 projects all over the place. I worked at a place that was secretly working on a chip that could execute both x86 and another architecture, then changed their mind. I had an offer at Rise, which was doing an x86. I interviewed at IBM in Vermont in 1995 or so, and they were working on one. Of course you had Transmeta (sort of x86). IDT/Centaur/Via. National. Of course I worked at NexGen as it was acquired by AMD. Chips and Technologies. There were a bunch. Most of them used IBM as their fab, because IBM had a license to x86.
 
This is somewhat adjacent to new CPU rumors, but AMD is now doing damage control over power usage with their recently announced, upcoming AM5 platform. Both Intel and AMD quote a TDP, but beyond that Intel uses PL1 and PL2, while AMD uses EDC, TDC, and PPT. Whichever convoluted spec you choose, Intel maxes out at 241W on LGA1700. The controversy is that AMD implied that AM5 would represent only a modest increase, to 170W PPT, crowing about their efficiency advantage, but AMD has since clarified that it will go as high as 230W.
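Worth noting: on AM4, PPT (the limit the socket will actually sustain) was about 1.35x the advertised TDP (e.g., 105W TDP parts carried a 142W PPT), and the new AM5 numbers fit the same ratio. A quick back-of-the-envelope, assuming that convention carries over (the 1.35 factor is the AM4 precedent, not an AMD-confirmed AM5 spec):

    #include <stdio.h>

    /* AM4 precedent: PPT ~= 1.35 * TDP (105 W TDP -> 142 W PPT).
       Assumption: the same ratio applies to AM5's 170 W ceiling. */
    int main(void) {
        const double tdp_watts = 170.0;            /* advertised AM5 TDP cap */
        const double ppt_watts = tdp_watts * 1.35; /* sustained socket limit */
        printf("TDP %.0f W -> PPT %.0f W\n", tdp_watts, ppt_watts);
        return 0;                                  /* prints ~230 W */
    }

That lands right on the 230W figure AMD later confirmed, so the "modest increase" framing was doing a lot of work.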

Back when the switch was first announced, there was some criticism of Apple for choosing to use their in-house microarchitecture, instead of simply switching to AMD, which was doing much better on power consumption during the 14nm++++++ era that chronically plagued Intel. Now, it's clear that Intel and AMD are going to continue to try to squeeze every last megahertz out of their chips, in order to one-up each other, power consumption be damned, as long as they win by 2% on synthetic benchmarks. In an attempt to explain this to enthusiasts and component partners, they've introduced varied ways of measuring power usage, muddying the waters and making it impossible to pin down a specific number. This is even worse on the GPU front, with Nvidia potentially hitting 600W with Lovelace, and the possibility of specialty cards from partners that could go higher. Given how reactionary the PC guys have become, I'm sure AMD will retaliate in kind, and if Intel ever cobbles together its graphics card business into something resembling a functional division, they will do the same.

As a fan of next-generation technology, particularly on the desktop, this is something that I am exposed to constantly when reading the latest news and rumors, like it or not. However, as a Mac user, I appreciate that I don't have to worry about any of this high-wattage measuring contest. Since Apple controls the whole stack, hardware and software, manufacturing the entire widget from top to bottom, I don't have to concern myself with this largely arbitrary, made-up nomenclature, and can let Apple handle the details. Today, it's abundantly obvious why Apple decided to do the engineering themselves and leave the inefficient x86 PC designs behind. (There's a reason that undervolting is a thing among PC enthusiasts.)
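That parenthetical is really just the physics of the curve: dynamic power scales roughly with C x V^2 x f, and holding a higher clock stable generally requires more voltage, so power climbs much faster than performance near the top. A toy illustration (the percentages are hypothetical, purely to show the shape, not measurements of any real chip):

    #include <stdio.h>

    /* Dynamic power ~ C * V^2 * f. A small clock bump that needs a small
       voltage bump costs disproportionately more power. Illustrative only. */
    int main(void) {
        const double f_gain = 1.10;  /* +10% frequency (hypothetical) */
        const double v_gain = 1.08;  /* +8% voltage to sustain it (hypothetical) */
        const double p_gain = f_gain * v_gain * v_gain;
        printf("+10%% clock at +8%% voltage -> +%.0f%% power\n",
               (p_gain - 1.0) * 100.0);  /* ~ +28% */
        return 0;
    }

Run it backwards and you get why undervolting works: shaving voltage cuts power quadratically while performance barely moves.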

From my perspective, there are three fundamental hardware aspects of any desktop computer:
1. Performance.
2. Size and weight.
3. Noise.

With a mainstream Windows PC, you can have maybe two of those, but not all three. With the switch to Apple Silicon, controlling the entire experience, Apple is able to release Macs that are fast, small, lightweight, and quiet. I also don't have to worry about blowing a fuse, having a space heater sitting on my desk, or a massive electricity bill, because Apple Silicon is inherently power efficient. Mac users are spoiled to the point where we don't even have to think about power consumption, because it's a non-factor. No amount of hand-waving, rebranding, or arbitrary power specifications is going to make up for that innate efficiency, which is a gigantic competitive advantage, and one that Mac users take for granted.
 