No “Extreme” chip coming to Mac Pro?

I find this believable. It's not like an M2 Extreme is going to look good next to Genoa or Sapphire Rapids, and it won't be able to compete with high-end GPUs either. Who would buy an extremely expensive, underperforming tower? If Apple is serious about the desktop, they have to invest in vertical scaling, and in scaling in general. There is also still a severe software issue. You probably don't need a Mac Pro for photo or even video editing; the Studio does it all beautifully. And Apple Silicon currently underperforms for 3D rendering — a common market for larger workstations. What's left for the Mac Pro? Software development? Massive overkill. Number crunching? Not the best use of the money. GPU processing or ML work? Not mature or fast enough...

So yeah, the way things are going now, Apple either has to recalibrate their approach and come up with more powerful/scalable chips, or go mobile-only.
This whole thing is bizarre to me. It seems Apple planned a very complex transition and yet, without even completing one full generation of chips, completely underestimated what was necessary for high-end desktops and, more importantly, for high-end GPU performance.

So I'm wondering if it's Gurman's story that's incorrect, or Apple just doesn't care about desktops. If it's the latter, why bother with the 2019 Mac Pro?

I've said it previously, but I have surprised myself at how quickly my feelings on Apple Silicon have gone from absolute certainty that it's the future and the correct decision... to a growing sense of dread (as with the 2013 Mac Pro) that they've taken a terrible wrong turn for more demanding users.
 

I know what you mean. When the M1 was first released, it was obvious to me that this was a first-gen proof-of-concept chip, a straightforward application of their iPad technology, and that they must have another family of chips in development specifically targeting the desktop. Two years later, those chips are still nowhere to be seen, and it's starting to look like maybe they were never a thing to begin with. Apple's execution on mobile is great (although if they don't iterate here they will be overtaken), but they don't seem to have a concrete plan for the desktop. The software stack is a mess as well. What's happening?
 
An M1 Max die is still around 450 mm²; four of those would be roughly 1800 mm², still smaller than a Threadripper package. I doubt that die area is the problem per se here. It's more that the entire infrastructure is set up around the GPU (large LLC and wide memory bus), and those are resources that the CPU doesn't really need.

For Apple it would probably be beneficial to split the SoC into separate CPU and GPU "extension clusters", which would allow them to be more flexible with the final product. But who knows, it's not a panacea either.
Since there's no other product that would benefit from separating the GPU from the CPU/NPU/other blocks, I'm unsure about the financial viability of doing that just for the Mac Pro, although a separate GPU was indeed rumored. Also, some workloads can saturate memory bandwidth on Apple Silicon without using the GPU, so while there would have been more optimal solutions if CPU performance were the only concern, it's not like those resources go unused.
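For a sense of what "CPU-only workloads can saturate the memory subsystem" looks like, here is a minimal STREAM-triad-style sketch. It's my own toy code, not a rigorous benchmark: no warm-up runs, arbitrary array sizes, and whatever thread scheduling the OS happens to give you.

```swift
import Foundation

// Toy STREAM-triad-style loop: a[i] = b[i] + k * c[i], split across CPU threads.
// Each element moves three doubles (two reads, one write), so the achieved GB/s
// is a rough lower bound on CPU-only memory bandwidth; no GPU involved.
let n = 32_000_000                                   // ~256 MB per array
let threads = ProcessInfo.processInfo.activeProcessorCount
var a = [Double](repeating: 0, count: n)
let b = [Double](repeating: 1, count: n)
let c = [Double](repeating: 2, count: n)
let k = 3.0

let start = Date()
a.withUnsafeMutableBufferPointer { ap in
    b.withUnsafeBufferPointer { bp in
        c.withUnsafeBufferPointer { cp in
            let ab = ap.baseAddress!, bb = bp.baseAddress!, cb = cp.baseAddress!
            DispatchQueue.concurrentPerform(iterations: threads) { t in
                let lo = t * n / threads, hi = (t + 1) * n / threads
                for i in lo..<hi { ab[i] = bb[i] + k * cb[i] }
            }
        }
    }
}
let seconds = Date().timeIntervalSince(start)
let bytes = Double(n) * 3 * 8                        // 3 doubles x 8 bytes each
print(String(format: "~%.0f GB/s sustained by the CPU cores alone", bytes / seconds / 1e9))
```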

I don't know what this means. Does "its" refer to the Ultra or the Mac Pro? If the latter, is he saying that the Mac Pro will offer the same 192 GB maximum unified RAM that's expected for the M2 Ultra but, unlike the Studio, will have secondary RAM that is expandable beyond that?
I think it's referring to the Mac Pro => secondary RAM.

This whole thing is bizarre to me. It seems Apple planned a very complex transition and yet, without even completing one full generation of chips, completely underestimated what was necessary for high-end desktops and, more importantly, for high-end GPU performance.
High-end GPU performance was always going to be a tough one. On every other front, I think they delivered. An M1 Ultra Mac Studio (which is almost a year old at this point) is still on par, CPU-performance-wise, with the latest Intel CPU (the i9-13900K), while using a core design that is now two generations old (the A14 generation). Apple Silicon has a better shot at keeping up with significant year-over-year performance improvements than Intel/AMD.
On the GPU performance front, I think the best course of action is what Apple is currently doing: investing development resources in improving Metal support in as many open-source projects as possible. Still, a 4-die M2 Extreme with 128 GPU cores should be enough to beat NVIDIA's 30-series, right? It's not like they're several years behind in performance.
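On paper, at least for raw FP32 throughput, the arithmetic works out. A back-of-the-envelope sketch, assuming 128 ALUs per Apple GPU core and a roughly 1.4 GHz M2-class clock (both my assumptions, not announced specs), against the RTX 3090's published core count and boost clock; real rendering performance depends on much more than this:

```swift
// Paper FP32 throughput: ALUs x 2 FLOPs per clock (FMA) x clock speed.
func tflops(alus: Double, ghz: Double) -> Double {
    alus * 2 * ghz / 1000.0   // GFLOPS -> TFLOPS
}

let m2Extreme = tflops(alus: 128 * 128, ghz: 1.4)  // hypothetical: ~45.9 TFLOPS
let rtx3090   = tflops(alus: 10_496,    ghz: 1.7)  // ~35.7 TFLOPS
print(m2Extreme, rtx3090)
```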
 

What they need is to operate the GPU at higher frequencies. The M1 Ultra already has 8192 ALUs, which is in the same ballpark as an Nvidia 3080, but Nvidia runs at 1.5-1.7 GHz while Apple runs at around 1.26 GHz.
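To put rough numbers on the clock argument (same back-of-the-envelope FP32 arithmetic as in the sketch above; the ALU count and clocks are the ones quoted in this post, and the 1.7 GHz case is purely hypothetical, not something Apple has shipped):

```swift
// FP32 throughput scales linearly with clock: ALUs x 2 FLOPs (FMA) x GHz.
let alus = 8192.0
print(alus * 2 * 1.26 / 1000)  // ~20.6 TFLOPS at the current ~1.26 GHz
print(alus * 2 * 1.70 / 1000)  // ~27.9 TFLOPS if it clocked like a 3080
```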
 
Who knows, maybe the M2 Pro/Max/Ultra/Extreme can already do that. The heatsink in the Mac Studio is big given the power consumption of the M1 Ultra... could it be future-proofing for hotter chips? We didn't get any leaked Metal benchmarks for the M2 Pro/Max family GPUs like we did for the CPUs, either, so there's not a lot of data to speculate on.
 
I was trying to keep away from tech news for a while, getting burned out on it, and wanted to enjoy the holiday season. Then Uncle Gurman had to go and drop another bomb on us.

the Mac Pro is expected to rely on a new-generation M2 Ultra chip (rather than the M1 Ultra) and will retain one of its hallmark features: easy expandability for additional memory, storage and other components.
Everyone is getting worked up over this vague quote, and I suspect that's Gurman's intention. He doesn't say anything about external DIMMs being available inside the new Mac Pro; he's just talking about "hallmark features", which could simply be a generality about the Mac Pro's historical place in the product line. This is Gurman's modus operandi: he makes statements that can be interpreted in multiple ways, which then allows him to claim that he "called it" down the road. If that doesn't work, his next go-to is "plans change". We forget his botches once we're distracted by his next shiny rumor.

I realize that he's the best we've got as far as Apple leaks go, but his best sources were burned years ago, and now he's running on fumes. He used to have codenames, precise specs, form factors, and launch dates. Now all he has are CPU/GPU core counts, which anyone could figure out from the base M2. He did score a reveal with the new macOS System Settings, so I'll give him that. Other than that, how many times have we heard about new hardware in the "coming months" from him? Again, more vague bread crumbs that we tech nerds turn into a fully baked loaf.

Concerning the quote above, here is my opinion; take it for what it's worth. I think Gurman put that vague reference in there for attention and clicks. The MR poster with a source who has physical access to the Mac Pro prototypes has repeatedly said the latest prototype has no DIMMs and no driver support for third-party GPUs, which follows the rest of the Apple Silicon line. The last update on that was October 24th.

So, in less than two months, Apple has changed the Mac Pro to support DIMMs (or something similar)? I'm just an average tech enthusiast, and I don't have @Cmaier's knowledge, but from what I gather, that's not something that can be tacked on within weeks. The only way I can see this happening is if Apple had planned for it ahead of time and simply didn't include it in the Mac Pro prototype from two months ago.

Or Gurman is simply running out of gas and that's why he keeps faltering. However, we need not worry, I'm sure there will soon be a Max Tech video to clear it all up for us.
 

The problem with DIMMs is that you then have some memory accesses that take far longer than others, depending on whether the address is in the DIMMs or in the package. So if you are going to do this, the engineering solution is to create a complicated memory controller that essentially turns the package memory into a cache: either a real cache (where each entry in package memory has a corresponding entry in DIMM memory, so your total memory is really the size of the DIMM memory) or, with much more difficulty, one that shuffles things back and forth based on use, so your total memory is DIMM memory plus package memory. No way they do the latter.

The former is not conceptually difficult, but it does require engineering a fairly complex chunk of logic with some interesting timing implications. I'm pretty sure they're not doing that either, because it would be something new on the chip, and they really just want to tile existing chips together rather than make special silicon for the Pro.
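To make the trade-off concrete, here is a toy model of the "package memory as a real cache in front of the DIMMs" option. All numbers are invented for illustration; the only points it encodes are that usable capacity collapses to the DIMM capacity and that average latency hinges on the hit rate:

```swift
// Toy model of the "package memory acts as a cache for the DIMMs" scheme
// described above. The capacities and latencies are made-up illustration values.
struct TieredMemory {
    let packageGB: Double        // fast in-package memory
    let dimmGB: Double           // slower, expandable DIMM memory
    let packageLatencyNs: Double
    let dimmLatencyNs: Double

    // Every DIMM line may be mirrored into package memory, so the usable total
    // is just the DIMM capacity, not the sum of the two.
    var usableGB: Double { dimmGB }

    // Average access latency for a given hit rate in the package-memory "cache".
    func averageLatencyNs(hitRate: Double) -> Double {
        hitRate * packageLatencyNs + (1 - hitRate) * dimmLatencyNs
    }
}

let mem = TieredMemory(packageGB: 192, dimmGB: 1024,
                       packageLatencyNs: 100, dimmLatencyNs: 300)
print(mem.usableGB)                         // 1024.0 GB usable
print(mem.averageLatencyNs(hitRate: 0.95))  // 110.0 ns average at a 95% hit rate
```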

Who knows.
 
What's left for the Mac Pro? Software development? Massive overkill. Number crunching? Not the best use of the money. GPU processing or ML work? Not mature or fast enough...

So yeah, the way things are going now, Apple either has to recalibrate their approach and come up with more powerful/scalable chips, or go mobile-only.
I hope that the problem is entirely technical and that a future generation of Mac Pro will feature an "Extreme" SoC, heralding Apple Silicon's triumphant victory, as it plants its flag over the fallen bodies of Xeon and EPYC.

However, I think the real explanation is that the Mac Pro is becoming a product looking for a market. Apple has been struggling with the Mac Pro since 2013. The trash can was a fancy art piece, but it didn't meet the needs of the target user. Then along came the iMac Pro, which I think was supposed to be the replacement for the Mac Pro, but customers wouldn't have any of it. Apple did the "apology tour", released the 2019 Mac Pro, but at a crazy price. The base model, for $6,000 USD, came with an anemic 256GB SSD after all. They had to do major upsells to turn a profit.

That was on the high-volume Xeon platform provided by Intel. Apple just designed the case and the oddball MPX modules. They released it because they were likely bleeding high-end pro customers to Windows and Linux workstations. That's probably gotten worse in the past few years. Now, they've likely lost some of the customers that would have purchased the Apple Silicon Mac Pro to the Mac Studio already. I think Gurman's old report of there being a big and small Mac Pro reflects this. The small version was the Mac Studio, and some of the folks waiting for the Mac Pro decided to bite early.

Now, Apple doesn't have the Xeon platform volume to depend upon. Assuming that the "Extreme" SoC ever existed somewhere other than Gurman's articles, how many of those would Apple have sold? I don't think it's a stretch to think not many. On top of that, many users in that market want third-party GPUs and DIMM slots (something that the crowd at MR is having an ongoing food fight over). That's a lot of added work for a product that probably sells less than 100K units per year, if that.

I think Apple has spent the past decade trying to figure out what to do about the Mac Pro. They begrudgingly keep it on as a flagship model, but profitability has become harder to attain. My guess, and this is just a guess, is that the Apple Silicon Mac Pro will feature an M2 Ultra, six PCIe slots, no DIMM slots, and no GPU upgrades of any sort, and that Apple will tell us it's "the best Mac Pro we've ever made" inside a fancy new case. Oh, and "we think you're going to love it." Then the few remaining pro users who haven't migrated to Windows or Linux will go vent on MR to teach Apple a lesson and wait for another apology that will never come.

As tech nerds, we find the Mac Pro infinitely interesting, because it's the one tantalizing model that hasn't been updated yet. We want something that is new and delightful, not predictable. However, it's a product that few will ever purchase, and Apple knows that. I think the Mac Pro is more important to those of us who will never buy one than it is to Apple as a business. Having a halo product is nice, but not if the investment is way beyond what it's worth in money and engineering effort.

Again, that's just my pessimistic take; I very much hope I'm wrong and that Apple has grand plans for taking the desktop performance crown. I'm just not seeing it as of yet.
 
And Apple Silicon currently underperforms for 3D rendering — a common market for larger workstations. What's left for the Mac Pro?
I'm curious what the implications of this underperformance are for its upcoming AR/VR headset.

How does 3D rendering work with these headsets? Are scenes pre-rendered in 3D on workstations like the Mac Pro, or does the headset itself render in 3D in real time, or is it a combination of the two?
 

It’s a different kind of 3D rendering :) I have little doubt that Apple will be an AR/VR leader. They have the best perf/watt on the market (although Qualcomm is catching up fast) as well as specialized features geared towards headsets and efficient operation (render target compression, variable rate rasterization…).
 
However, we need not worry, I'm sure there will soon be a Max Tech video to clear it all up for us.
Sorry for quoting myself, but I must be Svengali. At the same time I posted about Uncle Gurman, Max Tech was hard at work; the rumormongering wouldn't be complete without Vadim.

 
I'm curious what the implications of this underperformance are for its upcoming AR/VR headset.

How does 3D rendering work with these headsets? Are scenes pre-rendered in 3D on workstations like the Mac Pro, or does the headset itself render in 3D in real time, or is it a combination of the two?
While headsets initially received their data from a desktop computer, developers are increasingly moving away from that design, as it requires tethering to the computer. Encapsulating the graphics in the headset itself, while still driving the resolution and frame rates needed, is thus the holy grail. That requires low power and high enough performance, which is where Apple actually shines. Whether their own, rumored-to-be-very-expensive headset will be able to deliver is a separate matter.
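For a rough sense of scale, here's the kind of raw pixel throughput a standalone headset has to sustain before any shading cost. The per-eye resolution and refresh rate below are my own illustrative guesses, not specs for any announced device:

```swift
// Raw pixel throughput for a standalone headset; purely illustrative numbers.
let eyes = 2.0
let perEyeWidth = 3840.0, perEyeHeight = 2160.0   // assumed ~4K per eye
let refreshHz = 90.0

let gigapixelsPerSecond = eyes * perEyeWidth * perEyeHeight * refreshHz / 1e9
print(gigapixelsPerSecond)   // ~1.49 Gpixels/s, before any shading or reprojection
```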

@leman is referring to professional GPU scene rendering (think 3D animated movies), where Apple is good up to a point and then the lack of ray tracing and raw power really hurts. Ironically, this should be an area where Apple shines, as their unified memory is a big advantage here, but alas…

Edit: and I see @leman already responded with basically the same answer. 🙂
 
It’s a different kind of 3D rendering :) I have little doubt that Apple will be an AR/VR leader. They have the best perf/watt on the market (although Qualcomm is catching up fast) as well as specialized features geared towards headsets and efficient operation (render target compression, variable rate rasterization…).

Apple has also been collaborating with Stanford University's AR/VR laboratory for the last seven years. I suspect what will be released will be somewhat similar to what another Stanford collaborator presented at SIGGRAPH last August, and that the heavy-lift processing will be iPhone-driven via a wireless data link (perhaps via a special UWB mode that Apple can leverage from already employing UWB in the iPhone).

There's no use having another A-series CPU/GPU and a bulky battery in the glasses/headset when a custom chip/ASIC and a much smaller battery will do to handle the video streams. And a user's iPhone already has Wi-Fi/cellular/storage/apps/etc. to connect to the outside world for data/documents/etc.
 
Sorry for quoting myself, but I must be Svengali. At the same time I posted about Uncle Gurman, Max Tech was hard at work; the rumormongering wouldn't be complete without Vadim.


Oh dear. Another video in which he repeats information he knows to be nonsense. The 32 MB TLB garbage he previously referred to is back again. Plenty of people corrected him on Twitter, so now I have to assume he's acting in bad faith.
 
My all-time favorite is when Vadim claimed that Armv9 would lead to a 25% performance improvement in Apple Silicon. He had conflated Arm's claimed uplift for the next-generation Cortex cores with the announcement of the new instruction set. The guy is as carnie as it gets and doesn't care to learn.
 
I swear I didn't read this article before I posted my pessimistic outlook on the Mac Pro a few hours ago, but Ars Technica is also following my line of thinking. In fact, they're even more down on the machine than I am, and unlike me, they provide stats to back up their opinions.


I've recently noticed a decidedly negative tone among Mac enthusiasts regarding the Apple Silicon transition, particularly when it comes to desktops. All I can say is that I hope Apple proves us wrong and that what we are witnessing are just hiccups. I'd give them somewhat of a pass because of global events, which we are all aware of, but their competitors haven't stood still. I still get the sense that there is something wonky going on with the fruit company's silicon initiative. Whether that is technical, strategic, financial, or some combination of those, I'm uncertain. Regardless, we're well past the two-year deadline, so I'd like Tim Cook and Johny Srouji to show us what they've got.
 
I'm definitely part of that negativity. I remember when the M1 was announced at WWDC and one prominent Apple dev stated "be excited for the M1, but be even more excited by what will happen to the Mac in the next few years". I'll see if I can find that tweet, in part to see how accurate my memory is, but that was the essence.

I sometimes wonder if Apple is aware of what people think of the desktop situation, and how poor it looks for them: only a handful of meaningful upgrades in 10 years, the trash can Mac Pro, the iMac Pro, the 2019 Mac Pro, and the Studio. Is that acceptable? Do they just want portables and are too scared to say it?

At times I'm almost ready to give credence to those talking about the exodus that happened when Gerard Williams left and how much it has slowed down progress. Things do seem to have slowed since the A13. Worrying times, when the crazy MacRumors posters turn out to be more accurate than me! lol.

EDIT: found the tweet
"Congratulations to my colleagues on moving the Mac to Apple silicon -- an infrastructural effort beyond anything I've ever seen. This is a skyscraper whose plumbing really deserves a lot of appreciation. Be excited for the Mac this year. Be more excited for the Mac after that."
 
It’s a different kind of 3D rendering :) I have little doubt that Apple will be an AR/VR leader. They have the best perf/watt on the market (although Qualcomm is catching up fast) as well as specialized features geared towards headsets and efficient operation (render target compression, variable rate rasterization…).

While headsets initially received their data from a desktop computer, developers are increasingly moving away from that design, as it requires tethering to the computer. Encapsulating the graphics in the headset itself, while still driving the resolution and frame rates needed, is thus the holy grail. That requires low power and high enough performance, which is where Apple actually shines. Whether their own, rumored-to-be-very-expensive headset will be able to deliver is a separate matter.

@leman is referring to professional GPU scene rendering (think 3D animated movies), where Apple is good up to a point and then the lack of ray tracing and raw power really hurts. Ironically, this should be an area where Apple shines, as their unified memory is a big advantage here, but alas…

Edit: and I see @leman already responded with basically the same answer. 🙂
I don't know how this works—obviously!—but I was imagining they might pre-render certain costly-to-render objects using powerful machines and insert them as modules into the AR/VR code, so that they don't have to be rendered from scratch by the headset.

I.e., I'm thinking of a 3D sprite or a 3D cutscene, by analogy to what they use for 2D games:

And if you did need powerful machines to pre-render 3D sprites or cutscenes for an Apple AR/VR game, would that need to be done in macOS, or could it be done on any machine and then dropped into the AR/VR code?

Separately, could another possible market for the Mac Pro be those doing development work for ARM-based supercomputers, particularly since it will probably be the only commercially-available ARM-based workstation for some time?

Or would they need to do the dev work on the same microarchitecture used in the supercomputer? If the latter, I'd imagine a typical ARM supercomputer package would come with several custom development workstations made using the same processors as those in the supercomputer itself.
 
"Congratulations to my colleagues on moving the Mac to Apple silicon -- an infrastructural effort beyond anything I've ever seen. This is a skyscraper whose plumbing really deserves a lot of appreciation. Be excited for the Mac this year. Be more excited for the Mac after that."
Some of this may be the "irrational exuberance" of the moment. I too was excited by the release of Apple Silicon, but I can't say I ever put it in such glowing terms. Gurman, for what it's worth, has stated that he thinks the M2 is a stopgap, in his usual vague manner. It certainly feels like one, but maybe we were expecting a bit much after the sizable jump from Intel. As talented as Apple is, from its leadership to its engineering team, I still get the sense that something is off about the company's progress with Apple Silicon. I don't think it can be entirely blamed on global events, because their competitors continue to execute, more or less on schedule. Now it looks like we'll be waiting until March for new Macs, when Apple historically makes such announcements. It may be June at WWDC before the next Mac Pro is finally released, which would be year three of Tim Cook's two-year timeline.
 