Mac Pro - no expandable memory per Gurman

Cmaier

Site Master
Staff Member
Site Donor

Yeah, that’s what I thought.

Maybe, at least, the CPU will be on an easily replaceable board. Then you could upgrade memory in the future by swapping the entire SoC package. But that’s about as far as Apple could go without doing a lot of engineering work on a special CPU that can treat the on-package memory as some sort of cache, or that does hardware virtual memory mapping and treats the off-package RAM as a RAM disk.
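Just to illustrate what “on-package memory as some sort of cache” would mean, here’s a toy sketch in Python: a small fast tier acting as an LRU page cache in front of a big slow tier. Purely illustrative, and nothing to do with how Apple would actually build it.

from collections import OrderedDict

# Toy model: a small, fast on-package tier acting as an LRU page cache
# in front of a larger, slower off-package tier (purely illustrative).
FAST_PAGES = 4                               # capacity of the fast tier, in pages
fast = OrderedDict()                         # page -> data, ordered by recency
slow = {n: f"page-{n}" for n in range(32)}   # off-package backing store

hits = misses = 0

def read(page):
    """Return a page, promoting it into the fast tier on a miss."""
    global hits, misses
    if page in fast:
        hits += 1
        fast.move_to_end(page)               # mark as most recently used
        return fast[page]
    misses += 1
    if len(fast) >= FAST_PAGES:
        fast.popitem(last=False)             # evict the least recently used page
    fast[page] = slow[page]                  # slow fetch from off-package RAM
    return fast[page]

for p in [0, 1, 2, 0, 1, 3, 4, 0, 5, 0]:
    read(p)
print(f"hits={hits} misses={misses}")        # hits=4 misses=6 for this access pattern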

Somewhat surprised that allowing graphics cards seems to be in the works. Not a huge technical problem, but someone will have to keep writing drivers.
 

Jimmyjames

Site Champ

Yeah, that’s what I thought.

Maybe, at least, the CPU will be on an easily replaceable board. Then you could upgrade memory in the future by swapping the entire SoC package. But that’s about as far as Apple could go without doing a lot of engineering work on a special CPU that can treat the on-package memory as some sort of cache, or that does hardware virtual memory mapping and treats the off-package RAM as a RAM disk.

Somewhat surprised that allowing graphics cards seems to be in the works. Not a huge technical problem, but someone will have to keep writing drivers.
I found this entire report to be confusing. I had always thought that, given Apple's emphasis on unified memory, a replaceable GPU would have been much harder than replaceable RAM. Perhaps I was incorrect. In any case, I would have thought RAM would be a more popular upgrade than a GPU.

I'm left thinking there doesn't seem to be much point to this Mac Pro. No upgradeable RAM, no Extreme SoC. If true, they've taken the Studio and put it in the Mac Pro enclosure with few of the benefits of the Intel Mac Pro. Who would want that?

Given the rest of Gurman's report, I'm left with even less enthusiasm for Apple's latest moves. Previously the Mac had to suffer due to iOS being a priority. Now everything has to make way for the VR headset. Why must the new hot thing take priority over everything else? I am interested in the Mac. I don't care about VR. If Apple can't keep a minimum level of interest in the Mac for more than a few months, why should I bother giving them money?
 

dada_dave

Elite Member
I found this entire report to be confusing. I had always thought that, given Apple's emphasis on unified memory, a replaceable GPU would have been much harder than replaceable RAM. Perhaps I was incorrect. In any case, I would have thought RAM would be a more popular upgrade than a GPU.

I'm left thinking there doesn't seem to be much point to this Mac Pro. No upgradeable RAM, no Extreme SoC. If true, they've taken the Studio and put it in the Mac Pro enclosure with few of the benefits of the Intel Mac Pro. Who would want that?

Given the rest of Gurman's report, I'm left with even less enthusiasm for Apple's latest moves. Previously the Mac had to suffer due to iOS being a priority. Now everything has to make way for the VR headset. Why must the new hot thing take priority over everything else? I am interested in the Mac. I don't care about VR. If Apple can't keep a minimum level of interest in the Mac for more than a few months, why should I bother giving them money?
Discrete GPUs are a much easier engineering problem (just a question of drivers) than expandable RAM with the performance of soldered RAM. All the proposed solutions led to increased complexity at some level of the memory hierarchy. I should point out that there is a prototype from a company, I can’t remember which one, to in effect make user-replaceable LPDDR modules and have that become a standard. But I can’t remember the status.
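As a rough back-of-envelope on why “expandable RAM with the performance of soldered RAM” is such a hard ask (ballpark public figures, not exact specs):

# Why matching soldered LPDDR bandwidth with socketed DIMMs is hard
# (ballpark public figures, not exact specs).
ddr5_channel_gbps = 5600e6 * 8 / 1e9    # DDR5-5600, one 64-bit channel -> ~44.8 GB/s
ultra_uma_gbps = 800                    # quoted bandwidth of an M1/M2 Ultra package, GB/s

channels_needed = ultra_uma_gbps / ddr5_channel_gbps
print(f"DDR5-5600 channels needed to reach ~800 GB/s: {channels_needed:.1f}")  # ~17.9
# A desktop board has 2 channels; big workstation/server boards top out around
# 8-12, and every extra channel adds pins, board traces, and controller complexity.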

The Mac Pro allowing for discrete graphics cards makes a certain amount of sense. It’s a small percentage of the user market so it doesn’t really dilute the emphasis on unified memory for everything else and the professionals buying it are more likely to also have particular needs.

However, simply because the main issue is just drivers doesn’t mean this is going to happen. Nvidia is on the outs with everybody, especially Apple, and while AMD has worked with Apple in the past, this would be a product for just this segment. They may not view it as worth it. We’ll just have to see.
 

Jimmyjames

Site Champ
Discrete GPUs are a much easier engineering problem (just a question of drivers) than expandable RAM with the performance of soldered RAM. All the proposed solutions led to increased complexity at some level of the memory hierarchy. I should point out that there is a prototype from a company, I can’t remember which one, to in effect make user-replaceable LPDDR modules and have that become a standard. But I can’t remember the status.

The Mac Pro allowing for discrete graphics cards makes a certain amount of sense. It’s a small percentage of the user market so it doesn’t really dilute the emphasis on unified memory for everything else and the professionals buying it are more likely to also have particular needs.

However, simply because the main issue is just drivers doesn’t mean this is going to happen. Nvidia is on the outs with everybody, especially Apple, and while AMD has worked with Apple in the past, this would be a product for just this segment. They may not view it as worth it. We’ll just have to see.
Interesting, thanks.

I had thought that, given the emphasis on unified memory, they wouldn't dilute that for a product that sells in such small numbers. Wouldn't that be an admission of failure to compete on the high end?
 

dada_dave

Elite Member
Interesting, thanks.

I had thought that, given the emphasis on unified memory, they wouldn't dilute that for a product that sells in such small numbers. Wouldn't that be an admission of failure to compete on the high end?
Perhaps, but you can argue the other way: that they are catering to the needs of professionals without diluting their consumer focus much. Also, dGPUs or eGPUs could be limited to compute, or restricted in some other way so as not to interfere with Apple’s core strategy. For instance, let’s say Apple’s SoC can theoretically talk to such GPUs, but only under Linux/Windows (virtualized, or bare metal with Asahi) since there are no drivers otherwise.

Finally, we don’t really know what’s going to happen. This latest prediction makes the most sense from an engineering standpoint, but who knows? This guy was saying different stuff just a few weeks ago with regard to user-replaceable RAM. The Oracle at Delphi he ain’t.

I agree that the lack of an Extreme SoC this generation, if accurate, would be quite disappointing. I probably wouldn’t have the resources to buy one myself, depending on the cost, but it still would’ve been nice to see.
 

Jimmyjames

Site Champ
Perhaps, but you can argue the other way: that they are catering to the needs of professionals without diluting their consumer focus much. Also, dGPUs or eGPUs could be limited to compute, or restricted in some other way so as not to interfere with Apple’s core strategy. For instance, let’s say Apple’s SoC can theoretically talk to such GPUs, but only under Linux/Windows (virtualized, or bare metal with Asahi) since there are no drivers otherwise.

Finally, we don’t really know what’s going to happen. This latest prediction makes the most sense from an engineering standpoint, but who knows? This guy was saying different stuff just a few weeks ago with regard to user-replaceable RAM. The Oracle at Delphi he ain’t.

I agree that the lack of an Extreme SoC this generation, if accurate, would be quite disappointing. I probably wouldn’t have the resources to buy one myself, depending on the cost, but it still would’ve been nice to see.
Yes, overall I find Gurman's most recent predictions weird. I get the sense he's grasping at straws.
 

Cmaier

Site Master
Staff Member
Site Donor
Interesting, thanks.

I had thought that, given the emphasis on unified memory, they wouldn't dilute that for a product that sells in such small numbers. Wouldn't that be an admission of failure to compete on the high end?
If you stick a high-end AMD GPU in there, it will have higher performance than Apple’s unified-memory GPUs, simply because you can get more cores at a higher clock. The users don’t care about unified memory - if you are buying a Mac Pro, you just want performance.
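Back-of-envelope with rough public figures (treat these as ballpark, not exact):

# FP32 throughput ~= 2 ops per ALU per clock (FMA) x ALU count x clock.
def tflops(alus, ghz):
    return 2 * alus * ghz / 1000         # GFLOPS / 1000 -> TFLOPS

print(f"M1 Ultra 64-core GPU (~8192 ALUs @ ~1.3 GHz): {tflops(8192, 1.3):.0f} TFLOPS")   # ~21
print(f"AMD RX 6900 XT (5120 ALUs @ ~2.25 GHz): {tflops(5120, 2.25):.0f} TFLOPS")        # ~23
print(f"Nvidia RTX 4090 (16384 ALUs @ ~2.5 GHz): {tflops(16384, 2.5):.0f} TFLOPS")       # ~82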
 

Jimmyjames

Site Champ
If you stick a high-end AMD GPU in there, it will have higher performance than Apple’s unified-memory GPUs, simply because you can get more cores at a higher clock. The users don’t care about unified memory - if you are buying a Mac Pro, you just want performance.
So isn't that an admission that Apple can't compete on the high end?
 

B01L

SlackMaster
Predictions, using speculative reasoning...

Base ASi Mac Pro:
  • M2 Ultra SoC (N3B)
  • 24-core CPU (16P/8E)
  • 60-core GPU
  • 32-core Neural Engine
  • 96GB LPDDR5 SDRAM
  • 800GB/s UMA bandwidth
  • 1TB NVMe SSD (2 @ 512GB NAND blades)
  • (6) PCIe slots
  • US$5999
Fully-Loaded ASi Mac Pro:
  • M2 Ultra SoC (N3B)
  • 24-core CPU (16P/8E)
  • 76-core GPU
  • 32-core Neural Engine
  • 192GB LPDDR5 SDRAM
  • 800GB/s UMA bandwidth
  • 8TB NVMe SSD (2 @ 4TB NAND blades)
  • (6) PCIe slots
  • US$9999

But what I want is...

Fully-Loaded ASi Mac Pro Cube:
  • M3 Extreme SoC (N3X)
  • 64-core CPU (48P/16E)
  • 240-core GPU (w/hardware ray-tracing)
  • 64-core Neural Engine
  • 1TB LPDDR5X SDRAM
  • 2.13TB/s UMA bandwidth
  • 32TB NVMe SSD (4 @ 8TB NAND blades)
  • 420W PSU
  • 7.7" x 7.7" x 7.7"
  • US$24,999
 

dada_dave

Elite Member
If you stick a high-end AMD GPU in there, it will have higher performance than Apple’s unified-memory GPUs, simply because you can get more cores at a higher clock. The users don’t care about unified memory - if you are buying a Mac Pro, you just want performance.
So isn't that an admission that Apple can't compete on the high end?

That’s certainly true if no Extreme SoC ships, but a unified-memory GPU would, pound for pound, theoretically have far higher performance in a lot of professional applications than a discrete GPU, since even with latency hiding, shuttling large data sets back and forth over PCIe is still a pain. We even see this with Apple’s current GPUs on large rendering projects. Being able to have large data sets of hundreds of gigabytes simply live in accessible memory is a massive advantage. But you’re also right that if Apple doesn’t ship a GPU with enough cores, high enough clocks, or cores capable of ray tracing/machine learning, then that potential will go unrealized and the large, powerful, power-hungry discrete GPU will always eat their lunch.
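To put rough numbers on the PCIe point (ballpark figures, and a hypothetical 200 GB working set):

# Rough cost of staging a large scene into a discrete GPU over PCIe versus
# it already living in unified memory (ballpark figures, hypothetical data set).
scene_gb = 200           # hypothetical working set, in GB
pcie4_x16_gbps = 31.5    # usable bandwidth of a PCIe 4.0 x16 link, GB/s
uma_gbps = 800           # quoted M1/M2 Ultra memory bandwidth, GB/s

print(f"Copy to the dGPU over PCIe 4.0 x16: ~{scene_gb / pcie4_x16_gbps:.1f} s per pass")  # ~6.3 s
print(f"One full pass out of unified memory: ~{scene_gb / uma_gbps:.2f} s")                # ~0.25 s
# And a 200 GB data set doesn't fit in a 16-48 GB card's VRAM anyway,
# so it ends up being shuttled repeatedly rather than copied once.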
 

Cmaier

Site Master
Staff Member
Site Donor
So isn't that an admission that Apple can't compete on the high end?
No, it’s an admission that they choose not to. They simply don’t want to put in engineering work on a product that is so niche. They sell very, very few Mac Pros. Now that the Mac Studio is a thing, they will sell even fewer Mac Pros. So unless the design work and mask set result in something that can be leveraged into iMacs, high-end Mac Pros, etc., at this point they’ve decided it’s not worth it.

That said, I do expect them to compete at the ultra high end before too long, but when they do I expect it will be by partitioning the GPU and neural cores onto a separate chiplet that can be tiled separately from CPU cores in future SoC packages. So you have your CPU cores on one die, and your GPUs on another. Mac Pro gets a dozen of those GPU die, and the MacBook Pro gets two. Or whatever.
 

dada_dave

Elite Member
That said, I do expect them to compete at the ultra high end before too long, but when they do I expect it will be by partitioning the GPU and neural cores onto a separate chiplet that can be tiled separately from CPU cores in future SoC packages. So you have your CPU cores on one die, and your GPUs on another. Mac Pro gets a dozen of those GPU die, and the MacBook Pro gets two. Or whatever.

Yup, this is what I see happening too.
 

Jimmyjames

Site Champ
No, it’s an admission that they choose not to. They simply don’t want to put in engineering work on a product that is so niche. They sell very, very few Mac Pros. Now that the Mac Studio is a thing, they will sell even fewer Mac Pros. So unless the design work and mask set result in something that can be leveraged into iMacs, high-end Mac Pros, etc., at this point they’ve decided it’s not worth it.

That said, I do expect them to compete at the ultra high end before too long, but when they do I expect it will be by partitioning the GPU and neural cores onto a separate chiplet that can be tiled separately from CPU cores in future SoC packages. So you have your CPU cores on one die, and your GPUs on another. Mac Pro gets a dozen of those GPU die, and the MacBook Pro gets two. Or whatever.
Hmmm. They choose not to, but they will add a third-party GPU after ignoring them for 3 years while speaking constantly about the benefits of unified memory and their approach. I am in no way saying you're incorrect, but this doesn't feel like an approach that will inspire much confidence from pros. Perhaps they don't care about pros, and that's OK, but it would be better if they were honest about it rather than carrying on this charade of saying "we care about pros" and then leaving them without high-end GPU performance for the majority of 10 years.
 

dada_dave

Elite Member
Hmmm. They choose not to, but they will add a third-party GPU after ignoring them for 3 years while speaking constantly about the benefits of unified memory and their approach. I am in no way saying you're incorrect, but this doesn't feel like an approach that will inspire much confidence from pros. Perhaps they don't care about pros, and that's OK, but it would be better if they were honest about it rather than carrying on this charade of saying "we care about pros" and then leaving them without high-end GPU performance for the majority of 10 years.
They haven’t really been ignoring them; current users of the Intel Mac Pro have gotten new GPU upgrade options as late as March 2022. It’s more of an acknowledgment that the transition is going to be slower for many professional applications than for others, which would be true regardless of hardware capabilities. And the hardware capabilities have come more slowly than expected.
 

Cmaier

Site Master
Staff Member
Site Donor
Hmmm. They choose not to, but they will add a third-party GPU after ignoring them for 3 years while speaking constantly about the benefits of unified memory and their approach. I am in no way saying you're incorrect, but this doesn't feel like an approach that will inspire much confidence from pros. Perhaps they don't care about pros, and that's OK, but it would be better if they were honest about it rather than carrying on this charade of saying "we care about pros" and then leaving them without high-end GPU performance for the majority of 10 years.
They’re not “adding” a third-party GPU. They are selling a machine with slots in it. We always knew that the Mac Pro would have such slots; otherwise it’s just a Mac Studio.
 

B01L

SlackMaster
That said, I do expect them to compete at the ultra high end before too long, but when they do I expect it will be by partitioning the GPU and neural cores onto a separate chiplet that can be tiled separately from CPU cores in future SoC packages. So you have your CPU cores on one die, and your GPUs on another. Mac Pro gets a dozen of those GPU die, and the MacBook Pro gets two. Or whatever.

That would be the "GPU-specific" die I have previously espoused...

One "regular" die & one "GPU-specific" die make up a Mac Pro workstation Mn Ultra SoC...

Two "regular" & two "GPU-specific" dies make up a Mac Pro workstation Mn Extreme SoC...

Two, four, eight or more "GPU-specific" dies make up an Apple GPGPU card...

Display output from the "iGPU", compute/rendering from the GPGPU(s)...?
 

Cmaier

Site Master
Staff Member
Site Donor
That would be the "GPU-specific" die I have previously espoused...

One "regular" die & one "GPU-specific" die make up a Mac Pro workstation Mn Ultra SoC...

Two "regular" & two "GPU-specific" dies make up a Mac Pro workstation Mn Extreme SoC...

Two, four, eight or more "GPU-specific" dies make up an Apple GPGPU card...

Display output from the "iGPU", compute/rendering from the GPGPU(s)...?
Could be.

In the end, the reason the Mac Pro has always existed is that Apple had to admit that certain users needed more than Apple was willing to sell in a sealed box. They’ve always treated it like it was a niche market, and they are right - it’s a collection of MANY niche markets. They can never satisfy everyone with a sealed box, because even if they stuck the world’s most powerful GPU in it, someone else will need a crazy network card or data capture board, or whatever.

What the Arm transition allowed them to do was eat into that niche market by satisfying a lot more people’s needs. But now satisfying the next 100,000 people would require a lot more work. And the 100,000 after that even more. So slots will solve the problem for a bunch of them, and, over time, Apple will address more and more groups in that niche audience if, in doing so, they can also leverage those solutions for their core audience. But they’re never going to make a GPU board using a high-end chip they design that can only be sold to 10,000 people.
 

Jimmyjames

Site Champ
They haven’t really been ignoring them; current users of the Intel Mac Pro have gotten new GPU upgrade options as late as March 2022. It’s more of an acknowledgment that the transition is going to be slower for many professional applications than for others, which would be true regardless of hardware capabilities. And the hardware capabilities have come more slowly than expected.
From 2013 to 2019, the only options were the trash can Mac Pro (D300, D500, D700) or, from 2017, the iMac Pro (Vega 56, 64). Since 2019, the options have been the AMD 5000 and 6000 series.

Is that really an acceptable selection over ten years? To me, it’s pathetic.
 

Jimmyjames

Site Champ
They’re not “adding” a third-party GPU. They are selling a machine with slots in it. We always knew that the Mac Pro would have such slots; otherwise it’s just a Mac Studio.
Ok, but there seems to be little evidence of any software support for third-party GPUs. Where are these GPUs coming from? Not Nvidia for sure, so AMD?
 