WWDC 2023 Thread

Status
The first post of this thread is a WikiPost and can be edited by anyone with the appropriate permissions. Your edits will be public.

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,306
Reaction score
8,467
Between Gurman's rabid last-minute reporting, and your subtle hints, I've decided to watch WWDC, since the Mac news sounds more substantial than just a bigger MacBook Air and a macOS bug fix.

That being said, if you're wrong, we're taking your remaining functional limbs.


“And there’s one more mac left…which is a story for another day. When Colstan isn’t listening. Nobody tell Colstan.”
 

B01L

SlackMaster
Posts
173
Reaction score
129
Location
Diagonally parked in a parallel universe...
With 64 GB chips, I calculate up to 2 TB for an Extreme (4x Max):

Currently, the Ultra (2x Max) supports up to 16 chips x 8 GB/chip = 128 GB, so an Extreme (4x Max) would be 32 chips. Thus increasing the chip size to 64 GB should allow the Extreme up to 32 chips x 64 GB/chip = 2 TB RAM.

[Image: Apple M1 Ultra chipset]


Four chips for M1 Max
Eight chips for M1 Ultra (pictured above)
Sixteen chips for Mn Extreme

Sixteen x 64GB = 1TB
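
For what it's worth, here is the back-of-the-envelope arithmetic for both readings above as a tiny sketch (the 16-chip and 8-package counts are assumptions from this thread, not confirmed package layouts):

```swift
// Back-of-the-envelope RAM ceilings for a hypothetical "Extreme" (4x Max) with 64 GB chips.
// Reading A: the Ultra's 128 GB = 16 chips x 8 GB, so an Extreme would carry 32 chips.
// Reading B: count the 8 packages visible in the Ultra photo, so an Extreme would carry 16.
let chipSizeGB = 64
let extremeChipsReadingA = 2 * 16   // doubling a 16-chip Ultra
let extremeChipsReadingB = 2 * 8    // doubling the 8 visible packages

print("Reading A: \(extremeChipsReadingA * chipSizeGB) GB")  // 2048 GB = 2 TB
print("Reading B: \(extremeChipsReadingB * chipSizeGB) GB")  // 1024 GB = 1 TB
```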
 

theorist9

Site Champ
Posts
612
Reaction score
560
[Image: Apple M1 Ultra chipset]


Four chips for M1 Max
Eight chips for M1 Ultra (pictured above)
Sixteen chips for Mn Extreme

Sixteen x 64GB = 1TB
I suspect those eight packages in the M1 Ultra actually contain two memory dies each, giving a total of 16. Specifically, I thought the LPDDR5 in the M1 chips was limited to 8 GB/die (they didn't introduce the 12 GB dies until the M2), and thus the way the M1 Ultra achieved 128 GB RAM was with 16 dies x 8 GB/die = 128 GB. For the M1 Ultra to have achieved 128 GB with just 8 dies would require 16 GB LPDDR5 dies, and I didn't think Apple was using that in the M1. According to Micron, the max density available with LPDDR5 is 12 GB (https://www.micron.com/products/dram/lpddr5/).

Of course, to determine this for certain, you'd need to identify the component parts. Unfortunately, I couldn't track down info for the Max, but iFixit did a teardown of the base M2 Air (8 GB), and found it had two SK Hynix H58G56AK6HX052 4 GB LPDDR5 RAM dies; the 16 GB and 24 GB models are presumably 2 x 8 GB and 2 x 12 GB, respectively:

[Image: iFixit teardown photo of the base M2 MacBook Air's SK Hynix LPDDR5 packages]


According to another iFixit teardown ( https://www.ifixit.com/News/71442/tearing-down-the-14-macbook-pro-with-apples-help ), the base "M2 Pro has two SK Hynix 4GB LPDDR5 RAM modules on either side of the core—a total of four. These are the very same RAM modules we found in the M2 MacBook Air." So:

M2: 2 dies*, up to 12 GB each => 24 GB max
M2 Pro: 4 dies*, up to 8 GB each => 32 GB max
M2 Max: 8 dies**, up to 12 GB each => 96 GB max
M2 Ultra: 16 dies**
M2 Extreme: 32 dies**

*Reported by iFixit
**Assumed based on successive doubling
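
As a purely illustrative sketch, here is the same tally in code, using the die counts and per-die densities assumed above (the Ultra and Extreme rows are extrapolations by successive doubling, not confirmed parts):

```swift
// Max RAM per tier = die count x max die density (GB).
// M2 and M2 Pro die counts come from the iFixit teardowns cited above;
// the Max/Ultra/Extreme counts are assumed, and the Extreme is hypothetical.
let m2Family: [(name: String, dies: Int, maxDieGB: Int)] = [
    ("M2",          2, 12),
    ("M2 Pro",      4,  8),
    ("M2 Max",      8, 12),
    ("M2 Ultra",   16, 12),  // assumed
    ("M2 Extreme", 32, 12)   // hypothetical
]

for soc in m2Family {
    print("\(soc.name): \(soc.dies) dies x \(soc.maxDieGB) GB = \(soc.dies * soc.maxDieGB) GB max")
}
// M2: 24 GB, M2 Pro: 32 GB, M2 Max: 96 GB, M2 Ultra: 192 GB, M2 Extreme: 384 GB
```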
 
Last edited:

Citysnaps

Elite Member
Staff Member
Site Donor
Posts
3,688
Reaction score
8,986
Main Camera
iPhone
I've been closely watching comment reactions at the other place with respect to Apple's upcoming entry into AR, going on at least 3-4 years now (it seems). It's been pretty ugly, with most people refusing to unlock their imagination and just dream a little, or even do a wee bit of research. Instead, their minds are made up that it'll be a dud. In a word, frustrating.

While it seems many are still in the "it's going to flop" camp, I'm seeing some who are now speculating about real life AR possibilities. That's encouraging. I'm looking forward to seeing reactions there to next Monday's keynote presentation.
 

Roller

Elite Member
Posts
1,438
Reaction score
2,803
I've been closely watching comment reactions at the other place with respect to Apple's upcoming entry into AR, going on at least 3-4 years now (it seems). It's been pretty ugly, with most people refusing to unlock their imagination and just dream a little, or even do a wee bit of research. Instead, their minds are made up that it'll be a dud. In a word, frustrating.

While it seems many are still in the "it's going to flop" camp, I'm seeing some who are now speculating about real life AR possibilities. That's encouraging. I'm looking forward to seeing reactions there to next Monday's keynote presentation.
It's both amusing and frustrating, but typical for MR no matter what the product category. People post with such certainty.

I have no way of predicting if the AR/VR headset will be successful, but I can think of a lot of real-world applications, especially in vertical markets. I'm also looking forward to Monday's presentation more than I have any keynote in recent years, with announcements of new Macs and previews of iOS/iPadOS 17 and macOS 14 in addition to the other stuff.
 

Citysnaps

Elite Member
Staff Member
Site Donor
Posts
3,688
Reaction score
8,986
Main Camera
iPhone
It's both amusing and frustrating, but typical for MR no matter what the product category. People post with such certainty.

The part I'll never get is the apparent pride many seem to take in keeping their imaginations tightly closed to new ideas and what might be possible. Why?
 

Yoused

up
Posts
5,610
Reaction score
8,922
Location
knee deep in the road apples of the 4 horsemen
M2 Extreme: 32 dies
I remain skeptical about this. The mockups I see of the "Extreme" show 4 SoC tiles in a rectangle: that would not double the RAM die count, because half of the memory interface connections would be along the inside line, between the tiles. And if the implication is a linear interconnect, that has other issues, like signal-path delays and blocking half the die-end connections. "Extreme" just sounds like a fantasy.

I could maybe see some kind of setup where you have extra GPU core tiles stacked side-by-side on the end of a Max tile, but that only gets you 24, maybe 28 RAM controllers. Apple has to have some other strategy in mind for the Mac Pro – already, the Ultra struggles with attaining full resource saturation.
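
To put toy numbers on that intuition: assume, purely for illustration, that each Max-class tile is a square, that any edge shared with a neighboring tile is consumed by die-to-die interconnect, and that only outward-facing edges can host memory PHY. Counting free edges for a few layouts:

```swift
// Toy model: each tile is a unit square on a grid; an edge shared with a neighboring
// tile is assumed to carry die-to-die interconnect and is unavailable for memory PHY.
// Free (outward-facing) edges are a crude proxy for how much LPDDR can hang off the package.
func freeEdges(tiles: [(x: Int, y: Int)]) -> Int {
    let occupied = Set(tiles.map { [$0.x, $0.y] })
    var free = 0
    for t in tiles {
        for (dx, dy) in [(1, 0), (-1, 0), (0, 1), (0, -1)] {
            if !occupied.contains([t.x + dx, t.y + dy]) { free += 1 }
        }
    }
    return free
}

print(freeEdges(tiles: [(0, 0)]))                                  // Max:            4
print(freeEdges(tiles: [(0, 0), (0, 1)]))                          // Ultra (2x1):    6
print(freeEdges(tiles: [(0, 0), (0, 1), (1, 0), (1, 1)]))          // "Extreme" 2x2:  8
print(freeEdges(tiles: [(0, 0), (0, 1), (0, 2), (0, 3)]))          // "Extreme" 4x1: 10
```

Going from the Ultra's 6 outward edges to 8 in a 2x2 layout is nowhere near a doubling, which is essentially the objection above; the linear 4x1 layout keeps more edges free but runs into the signal-path and blocked die-end problems mentioned. These are toy counts, not actual controller numbers.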
 

Nycturne

Elite Member
Posts
1,137
Reaction score
1,484
Apple publicly said they’d complete the transition in about two years, and they’ve already missed that deadline. At this point, they have to announce an M Mac Pro or formally kill it. Apple is in a bind either way. A Mac Pro now has to distinguish itself from the Studio. And if they retire the Pro, or don’t mention it, it’ll look like they can’t execute on desktops.

Especially after the lashing they got for the huge gap between the 2013 and 2019 Mac Pros. It's not a great look to have back to back delays on what is essentially a flagship product, even if it is a niche one.

It'd almost be funny if it wasn't so sad. As we've discussed previously, I am also a desktop user and my next Mac will "probably" be an M3 Studio. Lately, however, I've been having...thoughts. When I see how regular the laptop cadence is, I think to myself that I should just accept that portables are where Apple's interest and heart lie, and given that, I should just get the next MacBook Pro 16" when it arrives. It would save me the worry of waiting to see when Apple neglects/kills my desktop machine of choice!

I hear that. I went the laptop route because I can just have a dock on my desk and a nice big monitor, and take my single machine with me. Not everyone wants that. One reason I used to have both an iMac and a MacBook of some kind was noise, but now the M# Max inside the 16" MBP fits into the same space the iMac used to: plenty of performance and low noise/heat.

For those who never work away from their desk, I wouldn't recommend it. Why pay for a battery and screen you'll never use? The Studio is cheaper for the same capability, and doesn't need a TB dock.
 

Jimmyjames

Site Champ
Posts
652
Reaction score
744
Apologies if this has already been discussed, but do we think a putative M2 Max/Ultra in the Studio will be clocked the same as the laptop chips, or is there any possibility of clocking them higher?
 

Jimmyjames

Site Champ
Posts
652
Reaction score
744
I assume higher. M2 has clock headroom.
Many thanks.

I wonder if an upclocked M2 is going to cut it for many. Obviously it depends on the performance they achieve, but Intel and AMD aren't sitting still, and Arm have announced their Cortex-X4, which makes some impressive claims. I am curious how Apple managed to have their plans delayed more than other companies did.
 

Jimmyjames

Site Champ
Posts
652
Reaction score
744
I'm not saying this is a credible score (quite the opposite), but for the sake of completeness, here is a supposed Mac Pro score on Geekbench


Lol, the bottom score says ~141,000 and the top score says 312,000! Fake.
 

quarkysg

Power User
Posts
69
Reaction score
45
I'm not saying this is a credible score (quite the opposite), but for the sake of completeness, here is a supposed Mac Pro score on Geekbench


Lol, the bottom score says ~141,000 and the top score says 312,000! Fake.
M2 Ultra with only 8P/4E looks fake to me. That should be M2 Max.
 

dada_dave

Elite Member
Posts
2,138
Reaction score
2,126
Many thanks.

I wonder if an upclocked M2 is going to cut it for many. Obviously it depends on the performance they achieve, but Intel and AMD aren't sitting still, and Arm have announced their Cortex-X4, which makes some impressive claims. I am curious how Apple managed to have their plans delayed more than other companies did.
From what I gather, we won't be seeing X4 chips until after the M3. It generally takes a while between ARM's announcement and the appearance of implementations in the wild. And my understanding, perhaps wrong, is that their numbers are based on access to an N3 or equivalent node. Basically, it's going to be a while before we see them. So while I think ARM has been a touch faster than Apple in microarchitecture releases recently, those are paper launches, not the dates when those chips are available to buy.

And Intel and AMD haven't actually been any faster than Apple in terms of microarchitecture revisions either. Since November 2020, Apple has released Firestorm/Icestorm and Blizzard/Avalanche, while Intel has released Golden Cove/Gracemont and Raptor Cove/Gracemont, and AMD has released Zen 3 and Zen 4. Zen 5 will release next year (with maybe an "E core" variant that actually goes into a consumer product) and … it's not clear when Redwood Cove/Crestmont will be released. Different sites at different times have given different timelines, with further differences between Intel's mobile and desktop lines, so we'll see. But that combo probably won't be seen before the A17/M3 debuts with its new microarchitectures, or at the earliest around the same time. Basically, they're all on similar cadences for microarchitecture releases.

Obviously AMD/ARM/Intel released many more chip variants in that time (and released most of those closer together), but Apple’s chip portfolio is simply smaller/more streamlined. They’re only releasing and designing chips for their own set of hardware.

Edit: I don't dispute that Apple has been frustratingly slow here, but that has less to do with microarchitecture than with taking a long time between releasing chips for major portions of its product lineup and still not having released a chip for a small but vital part of that lineup. I think they probably banked on having access to N3 much earlier, but you're right that they can't blame everything on that alone.
 
Last edited:

Colstan

Site Champ
Posts
822
Reaction score
1,124
“And there’s one more mac left…which is a story for another day. When Colstan isn’t listening. Nobody tell Colstan.”
Apple will be covering the latest iteration of the Apple TV, this time with 8K support, and I'll step away as a result of a severe case of monotonous boredom, having hit the mute button, while idly playing with my nephew's pro wrestling action figures. During those 37 seconds, the Apple Silicon Mac Pro will be announced, I'll miss it entirely, only to come here on TechBoards and bitch about it for multiple pages, as is tradition. Nobody will do me the courtesy of telling me of the announcement, and just play along with my misery, as is tradition.

My grousing aside, I'm hoping that WWDC will finally put an end to my Mac vs. PC decision making process. While I've always been likely to stay with the fruit company for my desktop computing needs, that has become more likely over the past few weeks. My three greatest concerns:

1. The ability to play Alan Wake 2, the sequel to my all-time favorite computer game. Details of the game have finally been announced and a release date of October 17th has been set, just in time for Halloween. Much to my dismay, the game will feature two campaigns, with the returning hero being playable in only about 50% of the game, and only appearing in locations from the expansion, areas which I very much disliked.

Alan Wake 2 has gone from "crawl across broken glass" to "wait for reviews" since that reveal. Perhaps this is actually a good thing for me, personally, because it would have been insanity to build an entire PC just for one game. Now that the irrational exuberance has subsided, the imperative to build a PC, for this specific purpose, has waned considerably.

2. As we have covered in the gaming forum, the number of native Apple Silicon titles announced for Mac before WWDC has been significantly more substantial than I had expected. (Something that @Cmaier predicted over a year ago.)

I've always said that I don't need access to the entire Windows PC library, just enough to keep me entertained. I would say that 80% of the games that I play are turn-based isometric RPGs, and essentially all of those have Mac native versions already, including the upcoming Baldur's Gate 3. Of the remaining 20%, I prefer horror and science fiction, which just so happen to be plentiful among the upcoming Mac games.

On top of that, as @dada_dave has been extensively covering, the Asahi Linux team have made remarkable progress with support for Windows games, and continue their endeavors with Proton compatibility for their distro. As I just mentioned in that thread, CodeWeavers have announced initial support for DirectX 12 in CrossOver, thus expanding gaming options further.

3. Finally, concerning @dada_dave's words of wisdom as written in the above post...

I don't dispute that Apple has been frustratingly slow here, but that has less to do with microarchitecture than with taking a long time between releasing chips for major portions of its product lineup and still not having released a chip for a small but vital part of that lineup. I think they probably banked on having access to N3 much earlier, but you're right that they can't blame everything on that alone.
Assuming our resident one-armed CPU architect is correct, there's a good chance that we will know more about the Apple Silicon Mac Pro by the time WWDC is over, being the final piece of the transition puzzle. As I've been bellyaching about, I'm concerned that Apple will punt its GPU efforts back to AMD, which is a solution that I find disquieting. I'm not going to purchase a Mac Pro just to play games. I would purchase an M(x) Pro Mac mini or M(x) Max Mac Studio with upgraded graphics for such activities. If Apple's solution for performance GPUs is to go crawling back to AMD, then lower-tier Macs would be stuck with eGPU options, which fits in the "never again" category for me. At this point, that's the only disqualifier.

Having chatted with a number of big brain folks here about the issue, both publicly and privately, my understanding is that the likelihood of Apple using third-party GPUs is dubious at best. As an example, over at the fever dream realm known as the MacRumors forum, the esteemed @leman just posted the following on the matter:

The design of Apple Silicon precludes separate GPUs (as it breaks the programming model) and to lesser degree RAM expansion (which is still potentially possible as a multi-tier RAM).

That's the short version of what I've heard from multiple smart folks on the subject. There are technical, financial, and cultural reasons that Apple is unlikely to use AMD GPUs, but until the Apple Silicon Mac Pro is finally announced, I'll still have a nagging feeling in the back of my skull. From my perspective, if Apple has confidence in their ability to release performant GPUs, then I'll have confidence in purchasing a new Mac as my next computer.
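
For what it's worth, the "breaks the programming model" point is easy to see in code. A minimal Metal sketch (assuming an Apple Silicon Mac; this only illustrates the unified-memory model, not anything about what Apple will ship in the Mac Pro):

```swift
import Metal

// On Apple Silicon the CPU and GPU share one physical memory pool, so a
// .storageModeShared buffer is visible to both sides with no copies or blits.
guard let device = MTLCreateSystemDefaultDevice(),
      let buffer = device.makeBuffer(length: 1024, options: .storageModeShared) else {
    fatalError("Metal device unavailable")
}

// The CPU writes straight into the buffer's memory...
let values = buffer.contents().bindMemory(to: Float.self, capacity: 256)
for i in 0..<256 { values[i] = Float(i) }
// ...and any GPU kernel bound to this buffer reads the same bytes immediately.

// With a discrete/external GPU the same data would live in separate VRAM and need
// managed/private storage plus explicit synchronization or copies, which is the
// "breaks the programming model" concern in the quote above.
```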

Thus far, all signs point to me firmly staying on the Apple ranch, which is a huge relief. It also means that, with my continued use of Macs, I won't become obsolete on these forums. Whether that is a good thing or a bad thing depends upon one's opinion on the value of my posts. Ha!
 

Aaronage

Power User
Posts
144
Reaction score
212
Re: the feeling that Apple is being slow.

So far, each model (base, Pro/Max, Ultra) has seen an update within 15 to 19 months:

October 2014: A8X
September 2015: A9X (11 months after A8X - unusually short!)
June 2017: A10X (21 months after A9X)
November 2018: A12X (17 months after A10X)
November 2020: M1 (24 months after A12X)
October 2021: M1 Pro/Max (first Pro/Max)
March 2022: M1 Ultra (first Ultra)
June 2022: M2 (19 months after M1)
January 2023: M2 Pro/Max (15 months after M1 Pro/Max)
Expected -> June 2023: M2 Ultra (15 months after M1 Ultra)
Expected -> November 2023: M3 (17 months after M2)

Couple notes on the above:
  • Included the previous “X” series as part of the Mx family (Mx is clearly a continuation of “X”, I’m happy to die on this hill lol)
  • Ignored A12Z as it was just A12X warmed over.
  • The previous “X” series was seeing updates every 17 to 24 months (removing A9X as an outlier).
  • The current M-series SoCs are seeing updates every 15 to 19 months (more frequent than the previous “X” series)
The frequency of updates is where I expected it to be, to be honest (personally, I always assumed the "X" series cadence would apply).

It would be nice to see each full generation delivered within a shorter timespan (as in, M1 Ultra arrived 16 months after M1), but I imagine that’s simply a consequence of the balancing act Apple has to perform (there are some constraints even Apple can't avoid 🙂)
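
As a quick cross-check of the gaps in that list, here is a small date-arithmetic sketch for the base-tier chain (dates as listed above; the 2023 entry is an expectation, not an announced product):

```swift
import Foundation

// Base-tier releases from the timeline above (year, month).
let releases: [(name: String, year: Int, month: Int)] = [
    ("A8X", 2014, 10), ("A9X", 2015, 9), ("A10X", 2017, 6), ("A12X", 2018, 11),
    ("M1", 2020, 11), ("M2", 2022, 6), ("M3 (expected)", 2023, 11)
]

let calendar = Calendar(identifier: .gregorian)
for i in 1..<releases.count {
    let prev = calendar.date(from: DateComponents(year: releases[i - 1].year, month: releases[i - 1].month))!
    let next = calendar.date(from: DateComponents(year: releases[i].year, month: releases[i].month))!
    let months = calendar.dateComponents([.month], from: prev, to: next).month!
    print("\(releases[i].name): \(months) months after \(releases[i - 1].name)")
}
// A9X: 11, A10X: 21, A12X: 17, M1: 24, M2: 19, M3 (expected): 17
```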
 

dada_dave

Elite Member
Posts
2,138
Reaction score
2,126
Assuming our resident one-armed CPU architect is correct, there's a good chance that we will know more about the Apple Silicon Mac Pro by the time WWDC is over, being the final piece of the transition puzzle. As I've been bellyaching about, I'm concerned that Apple will punt its GPU efforts back to AMD, which is a solution that I find disquieting. [...]

The design of Apple Silicon precludes separate GPUs (as it breaks the programming model) and to lesser degree RAM expansion (which is still potentially possible as a multi-tier RAM).

That's the short version of what I've heard from multiple smart folks on the subject. There are technical, financial, and cultural reasons that Apple is unlikely to use AMD GPUs, but until the Apple Silicon Mac Pro is finally announced, I'll still have a nagging feeling in the back of my skull. [...]
Given the rumors, I agree that a release of Mac Studios and a Mac Pro announcement with a release later in the year (which, as you pointed out before, is how they handled it previously) seems very likely. So I'd say most of Gurman's earlier hardware predictions seem pretty off base, which I, and others, already opined on at length before.

As far as the quote about e/dGPUs and RAM, I think that's a @leman quote? I would push back slightly on the idea that the programming model precludes e/dGPUs (after all, Metal still works fine on AMD e/dGPUs), but it's clearly not the direction Apple is moving in. Is it possible that Apple might allow e/dGPUs, especially on the Mac Pro? Yes, but it's unlikely, and if so, probably with one of two caveats: either AMD/Nvidia would have to do it on their own (in practical terms, technically feasible but DOA, since neither is likely to go to the bother of writing drivers, which Apple has also made harder to deploy), or official AMD support would be very limited, with the obvious implication that it exists only to ease professionals through the transition and isn't something that would continue. Given that Apple will want everyone on their unified memory model, and that Apple GPUs have been around long enough that a lot of software has already transitioned or soon will, this last option of official AMD GPUs for the Mac Pro would be … surprising - not impossible, but very, very unlikely. If it does happen, it would again primarily be to ease professionals off their current tools, but I'm guessing Apple is just going to rip the bandaid off here. I'm not sure how much of a bandaid is actually left to rip, to be honest? (Beyond the holdouts in the MacRumors forums.)

All that really remains to be seen is whether the Mac Pro has a new Blizzard/Avalanche-based SoC, just a souped-up M2 Ultra, or next-gen CPU/GPU cores, and whether or not it has ray tracing, either as part of the GPU or as a dedicated accelerator. I'm hoping that if an M2 Ultra is going into the Mac Pro, there is another chip tier above that. But we'll see!
 

dada_dave

Elite Member
Posts
2,138
Reaction score
2,126
Re: the feeling that Apple is being slow.

So far, each model (base, Pro/Max, Ultra) has seen an update within 15 to 19 months: [...]

It would be nice to see each full generation delivered within a shorter timespan (as in, M1 Ultra arrived 16 months after M1), but I imagine that's simply a consequence of the balancing act Apple has to perform (there are some constraints even Apple can't avoid 🙂)
There is some truth to this, but the issue is that Apple themselves laid out a timeline that was far more aggressive, saying that the transition would take 2 years. Their actual transition is going to take at least 3 at this point, and there have been weird gaps, like the Mac mini taking forever to complete the transition, no M2 iMacs, delays with the MacBook Pros, and of course no Mac Pro. No doubt much of this is due to external factors: a global pandemic and supply shock, difficulty with the new displays, silicon manufacturing delays and volume constraints at TSMC, etc … But it was Apple who set the expectations here, and even if things had gone better, their schedule was clearly too tight to execute on given even their considerable resources (more on that in the next paragraph).

There are also reports that while Apple's SoC team was extremely happy with how their work turned out, they were simultaneously exhausted and overwhelmed. Even though Apple has a much smaller chip portfolio than AMD/Intel, they had never before made anything like the Pro/Max/Ultra chips, and the relatively small team (compared to AMD/Intel) was overextended getting them all designed, built, verified, etc ... Hopefully that level of crunch won't be as bad going forward and many of these other issues will fade, but as remarkably smooth as the transition has been in most respects, it's not been without its challenges and delays.
 