Mac Pro - no expandable memory per Gurman

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,335
Reaction score
8,526
We actually have pretty good reason to believe that Jade 4C was never real, no matter how many anonymous forum posters swore they totally saw it. At best, it was a "canary trap" to find leakers. Reverse engineering, decapping, and so forth performed on Jade family SoCs (T6000 aka M1 Pro, T6001 aka Max, and T6002 aka Ultra) have found absolutely no evidence of a 4-die config which failed to make it to market. The die-to-die interconnect supports only 2 die through a passive interposer, and the interrupt controller only supports 2 die even though its register interface is designed to cleanly extend to more die in future SoC generations.

Sometimes you have to put the rumors aside and pay more attention to observed reality.

Yes, it was immediately clear just from looking at it that you couldn’t do 4C with the M1, which is something I explained all the way back then. But I don’t think it was a leak trap - I think it was something they intended for M2 (which seems to have run into a problem)
 

Colstan

Site Champ
Posts
822
Reaction score
1,124
Yes, it was immediately clear just from looking at it that you couldn’t do 4C with the M1, which is something I explained all the way back then. But I don’t think it was a leak trap - I think it was something they intended for M2 (which seems to have run into a problem)
I remember you saying that and believed you at the time. I still do. I figured there would be resistance when I conjured its ghost, but decided to do it anyway. The leaker I posted about specifically stated that the number of CPU/GPU cores in the original Mac Pro prototype was equal to a quad M1 Max, i.e. what would have been expected from an M1 "Extreme". Later, when they had access to a newer second prototype, they claimed it used the specs you'd expect of an M2 Ultra, which are the specs they currently stand by. I don't have an answer for that discrepancy; I'm just pointing out that it exists, and can only say that this leaker has a proven track record, better than Gurman or Kuo. Something is missing from the equation, and I don't know what it would be, so make of it what you will. Regardless, it has no particular impact on expected 2023 products.
 

dada_dave

Elite Member
Posts
2,166
Reaction score
2,154
The specs from the MR leaker matched Gurman's Jade 4C-Die:

Then the next, updated prototype, which is what he currently stands by, matches the M2 Ultra:

Double that and you get the M2 "Extreme".

Gurman said such a beast existed, back when he had reliable sources. The leaker on MR said that the Jade 4C-Die was tested with a 6900XT inside it and that it didn't work, so their claim is that it existed as a physical product. This is the same person who leaked the specs, case, and name of the Mac Studio before Apple announced it. I don't know what solution Apple may or may not have used for an M1 "Extreme", just that there were credible claims that Apple was working on something in that space. That doesn't mean it took the same form that the alleged M2 "Extreme" did. However, it would help explain why it is taking Apple so long to conjure up a suitable Mac Pro.

Regardless, whether an M1 "Extreme" existed or not doesn't matter, not now. So, I don't want to get too far off into the weeds on something that will never see the light of day, whether it is made of unicorn dust or not. What matters is what is cooking inside Apple's labs for the future, and many of us, particularly desktop users, are concerned about said future.

Right, but I doubt in this instance that the leaker was correct about the specs, or that it was an M1-based chip, which he doesn't specify. It matters a little, since you brought it up, and it determines whether they failed once or twice to bring forth an Extreme-level chip. Of course, we don't even know that they've failed this time. Kuo/Gurman could be wrong - for instance, this leaker still thinks a monster chip is coming. Gurman predicted a Jade 4C die made up of 4 M1 Maxes, which was clearly wrong regardless of what that leaker had access to, since it simply couldn't be 4 M1 Maxes. So the simplest and most likely conclusion is that his prediction simply didn't hold, rather than that Apple had to take two bites at the Extreme chip (or three if they've failed this generation). As you said, we should move on.

Which part do you disagree with?

In the round table meeting Apple had with the press, they did say they had created a pro-workflow group. There have been 3 strategies in 4 years: 2019 - tower with AMD GPUs. 2020 - SoC only. Now - SoC and third-party GPU if you need power??

Again, what do you disagree with here?

This part

Now - SoC and third-party GPU if you need power??

is more like this (if this rumor is even true):

Now - SoC, plus a third-party GPU on the Mac Pro line if you need compatibility with current software that requires AMD drivers, or you virtualize Linux/Windows, or you run Asahi Linux and need a 3rd-party GPU feature like CUDA for your work

A lot more specific, and nothing to do with the overall strategy beyond giving professionals capabilities they might like to have. Basically you're linking two things together that aren't necessarily related. The decision to add support for 3rd-party GPUs would've been made long before Apple discovered the M2 Extreme wasn't going to work - if that rumor is indeed true. Current Apple chips are not capable of communicating with 3rd-party GPUs - adding that in would have to be a conscious decision for higher-end M2 chips (well, it's more complicated than this: there are software limitations, but also hardware limitations for certain circumstances which may or may not apply to macOS/Windows but definitely would for Linux, if they want those GPUs to work on bare metal or in virtualized environments). Basically, if this was something they had planned to allow, it wasn't because they were disappointed with their own in-house GPU performance. So again, the failure of the M2 Extreme (if it has failed) would be the disappointment, and adding support for 3rd-party GPUs to the Mac Pro line doesn't signal any shift in strategy.

Edit: Think of it more akin to adding Rosetta - that doesn't signal that Apple thinks their ARM chips are bad and no one will write software for them, just that they are giving time for software to transition. It's a little more than that, obviously, since this also could help people working in Windows/Linux environments. In fact, those are great examples too: allowing different virtual environments or quietly supporting the Asahi Linux project doesn't signal a shift away from the strategy of supporting their own OS. It just gives professionals the capabilities they would otherwise miss without those tools. Again, this is if Apple has indeed made the necessary changes for 3rd-party GPUs to work on macOS or through Windows/Linux on Macs, none of which we actually know to be the case; this is just Gurman. Notably, the leaker who claims to have had hands-on time with the machine (or someone he knows did) says that 3rd-party GPUs do not work on the prototype. Reportedly it recognized the name of the device, but obviously without drivers and potentially without the necessary hardware changes ... of course that's a prototype, and maybe it's just a software change away from working ... or not ...
 

Colstan

Site Champ
Posts
822
Reaction score
1,124
It matters a little, since you brought it up, and it determines whether they failed once or twice to bring forth an Extreme-level chip. Of course, we don't even know that they've failed this time.
Touché!

Unlike Gurman, that same leaker still thinks that an "Extreme" is coming, even though he doesn't have precise knowledge of one:
as my pal have told me that their team have to launch MONSTER of CHIP for next mac pro for sake of making large step ahead of the others in Workstation market, so i think only 24 core CPU seem not enough (7000 amd threadripper and intel sapphire rapid is on the way)

I fully believed that we will see at lease 4*M2 max chip in 8,1 mac pro.

Again, I've always prefaced everything I have said with a rocket ship full of salt, and I have presented what I perceive to be the worst case scenario, and very much hope that I am wrong. Perhaps I'm simply being pessimistic because of Apple's silence. Gurman isn't helping with his "in the coming months" routine, followed up by "hallmark features" like expandable memory that he now says isn't actually expandable.

So again, the failure of the M2 Extreme (if it has failed) would be the disappointment, and adding support for 3rd-party GPUs to the Mac Pro line doesn't signal any shift in strategy.
This is where you and I simply have a difference of philosophy. In my mind, if Apple were to release a Mac Pro with third-party GPU support, even in an optional form, I think it's saying to their users, "Hey, our Apple Silicon graphics are great, but if you want the best, even better than ours, purchase the new Parhelia III from Matrox, exclusively for the Mac Pro today!" The peasants using regular Apple Silicon get fast, but still integrated GPUs, while only the premium customers get access to the good stuff.

That's fine on a technical level, but I wouldn't envy the marketing department having to answer some uncomfortable questions. If I were interviewing Craig and his fabulous hair after the release of such a Mac Pro, that would be my first question: "Why can't Apple Silicon compete with the new Parhelia?".

Marketing aside, isn't there a technical issue with this scenario? Graphics drivers are non-trivial, just ask Intel. Before Apple releases third-party GPUs inside the Mac Pro, I would think they'd want to send out test units inside of eGPUs first to get feedback. Also, this is way, way out of my field of expertise, but from what I understand, unlike IOKit, DriverKit has no abstraction for display GPU cards. What I'm saying is that we just haven't seen, well, anything to support the notion that Apple plans to go beyond what they already offer in Apple Silicon, outside of Gurman's latest whims?

Speaking of which, are we going to get another article in three weeks stating that third-party graphics are not supported in the Mac Pro? He's already got the clicks over DIMMs, twice now. The Mac Pro supporting say, the RX 7000-series, would be a big scoop, but it seems like another "hallmark feature" that he halfheartedly tossed in with no further explanation.

If nothing else, at least we have Gurman to keep us guessing during the cold winter months. Regardless, I want to be clear that, like everyone else here, I want Apple to succeed with the M-series. I've just become more reserved with my expectations as time has marched on and new Macs seem to be forever on the horizon, but not quite within reach. I'm hoping for some clarity, perhaps by March, and hopefully we won't have to wait until WWDC for the complete Apple Silicon picture.
 

dada_dave

Elite Member
Posts
2,166
Reaction score
2,154
Touché!

Unlike Gurman, that same leaker still thinks that an "Extreme" is coming, even though he doesn't have precise knowledge of one:


Again, I've always prefaced everything I have said with a rocket ship full of salt, and I have presented what I perceive to be the worst case scenario, and very much hope that I am wrong. Perhaps I'm simply being pessimistic because of Apple's silence. Gurman isn't helping with his "in the coming months" routine, followed up by "hallmark features" like expandable memory that he now says isn't actually expandable.

Yeah I just don’t like seeing people get despondent over this stuff. It’s definitely aggravating that we don’t have the full picture yet, but this kind of rumor mill can drive you crazy if you keep getting jerked around by it.

This is where you and I simply have a difference of philosophy. In my mind, if Apple were to release a Mac Pro with third-party GPU support, even in an optional form, I think it's saying to their users, "Hey, our Apple Silicon graphics are great, but if you want the best, even better than ours, purchase the new Parhelia III from Matrox, exclusively for the Mac Pro today!" The peasants using regular Apple Silicon get fast, but still integrated GPUs, while only the premium customers get access to the good stuff.

That's fine on a technical level, but I wouldn't envy the marketing department having to answer some uncomfortable questions. If I were interviewing Craig and his fabulous hair after the release of such a Mac Pro, that would be my first question: "Why can't Apple Silicon compete with the new Parhelia?".
Easy: It’s not that it’s more powerful, but some of our customers use programs that rely on AMD drivers or CUDA, and this gives those professionals access to the tools they need. Naturally we will be working with our valued partners to ensure that our even more valued customers get the best, fully native experience, but we aren’t going to cut our customers off from their work in the meantime.

You guys are getting hung up on whether the discrete GPUs would be more “powerful” rather than on whether they would offer support for programs that Apple Silicon simply doesn’t (yet), because those programs are specifically written with AMD or Nvidia GPUs in mind.

Marketing aside, isn't there a technical issue with this scenario? Graphics drivers are non-trivial, just ask Intel. Before Apple releases third-party GPUs inside the Mac Pro, I would think they'd want to send out test units inside of eGPUs first to get feedback. Also, this is way, way out of my field of expertise, but from what I understand, unlike IOKit, DriverKit has no abstraction for display GPU cards. What I'm saying is that we just haven't seen, well, anything to support the notion that Apple plans to go beyond what they already offer in Apple Silicon, outside of Gurman's latest whims?

Speaking of which, are we going to get another article in three weeks stating that third-party graphics are not supported in the Mac Pro? He's already got the clicks over DIMMs, twice now. The Mac Pro supporting say, the RX 7000-series, would be a big scoop, but it seems like another "hallmark feature" that he halfheartedly tossed in with no further explanation.

It would not surprise me in the least. I’ll start off on the positive: AMD is well versed in writing Mac drivers for their GPUs and has continued to do so for their 6000-series GPUs and Intel Macs, including this year. So the issue isn’t, as @Jimmyjames contends, that Apple has ignored AMD, nor would AMD be writing a driver completely from scratch. The issue is that AMD would have to write 7000-series drivers and support them (though Apple is known to help with both writing and support) for a fraction of an already small product segment. Not the most attractive proposition for either Apple or AMD. In this scenario Apple would be officially shipping drivers, but no, DriverKit is not currently built to handle GPU drivers.

Then there’s the question of whether it would work under virtualization or on bare metal, where there are already drivers. So problem solved? Well... no. Is Windows officially allowing people to buy keys for ARM yet, anyway? And Apple would have to change their virtualization layer to allow for PCIe passthrough. For Linux, whether virtualized or bare metal, Apple would have to change their silicon to allow normal memory mapping over PCIe BARs instead of treating it as device memory (CUDA might work regardless). This is why I said to @Jimmyjames that there are so many moving parts here: if this is happening, it has been in the works long before any of the rumored cancellations of the M2 Extreme happened.

All this suggests a lot of work, and potentially business deals behind the scenes, so is it actually happening? I dunno 🤷‍♂️. It seems like a lot of work to serve a small customer base. Unlike you guys, I think it would be nice if Apple put all that effort in to support their customers, but if this all turned out to be bullshit I wouldn’t exactly be surprised.

If nothing else, at least we have Gurman to keep us guessing during the cold winter months. Regardless, I want to be clear that, like everyone else here, I want Apple to succeed with the M-series. I've just become more reserved with my expectations as time has marched on and new Macs seem to be forever on the horizon, but not quite within reach. I'm hoping for some clarity, perhaps by March, and hopefully we won't have to wait until WWDC for the complete Apple Silicon picture.
I agree. The timeline has been stretched very, very badly. And that is very annoying. Having said that, everything is not wine and roses on the other side of the fence. Nvidia and AMD are both getting roasted for their overly hot GPUs with poor pricing. People are pissed. And while on the CPU side AMD and Intel got these generations out, it’s important to note that both are playing catch-up to Apple in fabrication, and the future roadmap for both starts to look more uncertain. When will AMD get to use TSMC’s 3nm? Will Intel actually be able to execute on its ambitious roadmap, for either its architecture or its fabrication, this time?

My sense is: expect more delays, expect rising costs, expect more power-hungry designs and design mistakes, and expect them industry-wide. Buckle up, because the road gets worse from here for everyone. Chiplets and novel packaging solutions will help, but they won’t solve everything. The only positive for Apple is that their more efficient designs give them an edge in this environment, but not an insurmountable one if they suffer more delays. I suppose that’s my sunny take of the day.
 

Colstan

Site Champ
Posts
822
Reaction score
1,124
Yeah I just don’t like seeing people get despondent over this stuff. It’s definitely aggravating that we don’t have the full picture yet, but this kind of rumor mill can drive you crazy if you keep getting jerked around by it.
I think it's a result of Gurman losing his best sources. Instead of someone with direct knowledge of Apple's plans, he's getting it through a game of telephone. Yet, we keep falling for it, because he's all we've got right now.
You guys are getting hung up on whether the discrete GPUs would be more “powerful” rather than on whether they would offer support for programs that Apple Silicon simply doesn’t (yet), because those programs are specifically written with AMD or Nvidia GPUs in mind.
Again, we'll just have to agree to disagree on this particular point, which is fine. Life would be awfully boring if everyone agreed on everything and thought the same way.
Unlike you guys, I think it would be nice if Apple put all that effort in to support their customers, but if this all turned out to be bullshit I wouldn’t exactly be surprised.
It's not about what I think Apple should do, it's what I think they will do. I think that Apple has spent the last decade trying to figure out what to do with the Mac Pro, as a product, and is still struggling with it. The will they/won't they with the "Extreme" chip is the worst romantic comedy that has ever been produced.
The timeline has been stretched very, very badly. And that is very annoying. Having said that, everything is not wine and roses on the other side of the fence.
I know. Which is partially why I'm so salty; it's not just with Apple and their apparent product delays. @Cmaier may be right that x86 PCs are not the future, but for today, they are the Mac's competitors. They're screwing up in their own ways. Both Nvidia and AMD are getting roasted for hot, overpriced GPUs. AMD's Zen 4 platform is too expensive, thanks to motherboard costs and DDR5 prices. Intel is a basket case with Arc and Sapphire Rapids, and Meteor Lake looks to be a sore spot. At least they seem to have a somewhat decent release with Raptor Lake, although I question the need for all of those E-cores.

Generally speaking, I'm not happy with anyone in the industry right now. However, the difference between Apple and the PC crowd, at the moment, is that the PC guys are shipping. No matter how good or bad the higher-end M2 series are, I think they should be in products right now, taking on their imperfect rivals. If we don't see something new and interesting after WWDC, then it's time for panic, but for the time being, I'm going to try to keep my expectations under control.
 

leman

Site Champ
Posts
642
Reaction score
1,196
I suppose we'll have to wait and see. The Mac Pro Gurman describes definitely sounds underwhelming. The GPU situation depends on whether Apple has a new, more desktop-oriented design. There is no technical reason why they wouldn't be able to compete with the big boys if they want to. It's a question of policy, not ability.
 

B01L

SlackMaster
Posts
176
Reaction score
132
Location
Diagonally parked in a parallel universe...
I don’t see that leaker mention an M1; that was an M2 prototype by the looks of it? Gurman was just wrong about the M1 Extreme, as Hector Martin detailed on several occasions - the M1 Max was never designed to go above 2 tiles. It was logically impossible for it to do so without further changes.

Initial post was mid-July of 2022, with three or four mentions of an M1-family-based single-PCIe-slot prototype; then we jump forward to October of 2022, about 18 pages deep, for the first mention of an M2 Ultra-based six-PCIe-slot prototype housed in the 3D Cheesegrater 2.0 chassis...

If we don't see something new and interesting after WWDC, then it's time for panic, but for the time being, I'm going to try to keep my expectations under control.

The MacRumors "official" Apple roadmap has slated the first quarter of 2023 for M2 Pro & M2 Max 14"/16" MacBook Pro laptops, M2 & M2 Pro Mac mini headless desktops, and M2 Ultra Mac Pro headless desktops... They also say we could get another 27" Apple display, but with rumored miniLED & 120Hz ProMotion it will be more expensive than the $1599 Apple Studio Display...

I would think a preview of the 2023 ASi Mac Pro, with actual shipping units no later than WWDC 2023...?
 

leman

Site Champ
Posts
642
Reaction score
1,196
The MacRumors "official" Apple roadmap has slated the first quarter of 2023 for M2 Pro & M2 Max 14"/16" MacBook Pro laptops, M2 & M2 Pro Mac mini headless desktops, and M2 Ultra Mac Pro headless desktops... They also say we could get another 27" Apple display, but with rumored miniLED & 120Hz ProMotion it will be more expensive than the $1599 Apple Studio Display...

Don't pay any attention to that stuff, that's as arbitrary as any random forum post. These MR guides have zero factual information behind them.
 

B01L

SlackMaster
Posts
176
Reaction score
132
Location
Diagonally parked in a parallel universe...
Don't pay any attention to that stuff, that's as arbitrary as any random forum post. These MR guides have zero factual information behind them.

LOL, come on bro, everyone with half a brain knows that MacRumors is lousy with "Everything We Know" articles regurgitating the same trash week after week... ;^p
 

B01L

SlackMaster
Posts
176
Reaction score
132
Location
Diagonally parked in a parallel universe...
I just want to see a "high-end" ASi Mac mini with a Mn Pro SoC and I want to see what the ASi Mac Pro has to offer...

Once we get the debut & release of the first generation ASi Mac Pro behind us, we can start speculating on what the second generation ASi Mac Pro will bring us...!

N3X M3 Extreme SoC with LPDDR5X SDRAM and hardware ray-tracing...?!?
 

theorist9

Site Champ
Posts
613
Reaction score
563
Assuming the Mac Pro doesn't have expandable RAM, and features an Ultra chip:

The Ultra has 16 RAM modules, and the max module size we've seen for the M2 is 12 GB (LPDDR5) => 192 GB max RAM. Are higher module densities available with LPDDR5x or DDR5 and, if so, is there any chance they might modify the M2 chip in the upcoming Mac Pro to accept them, or would that require too much redesign?
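Just to make the arithmetic explicit, here is a minimal sketch of it - the 16 packages and 12 GB/package figures come from the question above, while the higher densities are purely hypothetical, included only to show how the ceiling would scale:

```python
# Max unified memory = number of LPDDR packages x per-package density.
# 16 packages and 12 GB/package are the figures assumed above;
# the 16 GB and 24 GB densities are hypothetical, for illustration only.
packages = 16
for density_gb in (12, 16, 24):
    print(f"{packages} x {density_gb} GB = {packages * density_gb} GB max")
# -> 16 x 12 GB = 192 GB, 16 x 16 GB = 256 GB, 16 x 24 GB = 384 GB
```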
 

Citysnaps

Elite Member
Staff Member
Site Donor
Posts
3,695
Reaction score
8,995
Main Camera
iPhone
Assuming the Mac Pro doesn't have expandable RAM, and features an Ultra chip:

The Ultra has 16 RAM modules, and the max module size we've seen for the M2 is 12 GB (LPDDR5) => 192 GB max RAM. Are higher module densities available with LPDDR5x or DDR5 and, if so, is there any chance they might modify the M2 chip in the upcoming Mac Pro to accept them, or would that require too much redesign?
I'm left wondering how much RAM would be enough for the majority of potential Mac Pro purchasers, and what their applications are that require, say, 192 GB of memory or more. I might be able to come up with one or two, but they'd definitely be off the beaten path.
 

mr_roboto

Site Champ
Posts
288
Reaction score
464
In the round table meeting Apple had with the press, they did say they had created a pro-workflow group. There have been 3 strategies in 4 years: 2019 - tower with AMD GPUs. 2020 - SoC only. Now - SoC and third-party GPU if you need power??
I look at the same reality and see only one strategy since the big press event about the future of Mac Pro: Apple wants to sell a compact non-expandable desktop to Mac workstation customers with lesser needs, and an expandable deskside/rackmount machine to those with greater needs.

You're seemingly obsessed with tactics - how Apple provides CPU and GPU performance. But most customers don't care; they just want the capability. If Apple decides to continue AMD GPU support for a few more years, there won't be much whiplash. Devs already have to worry about AMD GPU support, since today's x86 Mac Pro userbase needs it.

None of this contradicts the overall plan for the future of workstation Macs laid out in that big press event. One of my takeaways was that Apple realized they'd let customers down, and was promising to be more pragmatic when things didn't go to plan. The iMac Pro was their first step in that direction - they explicitly said it was created to get a more modern alternative to the 2013 Mac Pro out the door as soon as possible.

So if these rumors are real, Apple has suffered a rare miss on a major chip tapeout, and AMD GPUs are the practical way for Apple to put something reasonably good on the market soon rather than waiting another year or two for the better SoC solution to be ready. That's OK. Not ideal, but OK. Don't really see what you're so upset about.
 

theorist9

Site Champ
Posts
613
Reaction score
563
I'm left wondering how much RAM would be enough for the majority of potential Mac Pro purchasers, and what their applications are that require, say, 192 GB of memory or more. I might be able to come up with one or two, but they'd definitely be off the beaten path.
If you're running, say, 20 separate extended calculations simultaneously (which you might want to do if you've got 20 cores), where each requires 30 GB RAM, then that's 600 GB RAM total. Or you could be doing a calculation on a single very large matrix several hundred GB in size.
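For a rough sanity check on those numbers (the matrix dimension below is just an illustrative guess, not taken from any real workload):

```python
# 20 concurrent jobs at 30 GB each.
jobs, gb_per_job = 20, 30
print(f"{jobs} jobs x {gb_per_job} GB = {jobs * gb_per_job} GB")       # 600 GB

# A single dense float64 matrix needs 8 bytes per element.
n = 200_000                                   # hypothetical matrix dimension
matrix_gb = n * n * 8 / 1e9
print(f"{n:,} x {n:,} float64 matrix = {matrix_gb:.0f} GB")            # 320 GB
```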
 

B01L

SlackMaster
Posts
176
Reaction score
132
Location
Diagonally parked in a parallel universe...
The M2 MacBook Air has the "old" RAM limits, 8GB & 16GB, but also adds a higher maximum RAM, 24GB...

So maybe the higher end M2 SoCs do the same...?
  • M2 Pro = 16GB / 32GB / 48GB
  • M2 Max = 32GB / 64GB / 96GB
  • M2 Ultra = 64GB / 128GB / 192GB
  • M2 Extreme = 128GB / 256GB / 384GB
So basically the offerings are Baseline, Baseline x2, or Baseline x3...
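A tiny sketch of that speculated pattern (the baselines below are this post's guesses extrapolated from the M2 MacBook Air, not announced configurations):

```python
# Each family: baseline, 2x baseline, and 3x baseline RAM tiers.
# Baselines are speculative, doubling per family from the plain M2's 8 GB.
baselines_gb = {"M2": 8, "M2 Pro": 16, "M2 Max": 32, "M2 Ultra": 64, "M2 Extreme": 128}

for chip, base in baselines_gb.items():
    print(f"{chip}: " + " / ".join(f"{base * k}GB" for k in (1, 2, 3)))
# -> M2 Pro: 16GB / 32GB / 48GB ... M2 Extreme: 128GB / 256GB / 384GB
```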
 

Jimmyjames

Site Champ
Posts
676
Reaction score
764
Don't really see what you're so upset about.
I’m not sure upset is the correct word. In the grand scheme of things, a computer GPU is very unimportant. Having said that, I greatly prefer macOS and desktops to other platforms, and in recent years I have found Mac desktops to be pretty uninspired and, what’s worse, I can’t recall a second version of any of them. Trash can - one version. iMac Pro - one version. Studio - one so far, and rumors there won’t be a second. 2019 Mac Pro - maybe a second edition, but nothing yet, over three years later.

How is this acceptable for any pro desktop user?

You say I’m obsessed with tactics and strategy, but I’m merely trying to find a strategy. Most pros would have been happy with a tower and replaceable components like PC manufacturers make. It feels like Apple hates that approach because it makes it easy to compare the cost with PCs. Instead they want custom components as a means of increasing margins. I’m fine with that. If the Mac Pro with their SoC is the future of high-end Macs, I’d be happy. Only they seemingly can’t iterate successfully with this approach. They were unable to complete one full cycle. So we’re left with a situation where they don’t want to make the computer that pros prefer - a tower with replaceable components - and they can’t iterate on their preferred approach.

It should be concerning for any pro.
 

theorist9

Site Champ
Posts
613
Reaction score
563
It's been mentioned that one can't expect Apple to put much into the Mac Pro because its sales volume is likely to be small. But I suspect the Mac Pro, exceptionally, isn't just about profits. Apple may consider it important, for the continued health of the Apple ecosystem, that the kinds of users that would buy Mac Pros remain within it.

Plus, as a halo product, its existence serves a marketing function for non-Mac Pro buyers - recall the several Mac Pros featured prominently in the background during Johny Srouji's presentations about Apple Silicon. In this latter sense Apple could take a small loss on the Mac Pro and consider it part of its marketing budget (not an exact analogy, but consider the marketing effect of Honda's participation in Formula 1).

Finally, there's the intangible benefit of the Mac Pro for engineer morale within Apple itself. If I were a hardware engineer at Apple, I suspect I'd like it that my company continues to produce a product like the Mac Pro (akin to the effect of Honda's Formula 1 efforts on the Honda engineers working on Accords).
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,335
Reaction score
8,526
It's been mentioned that one can't expect Apple to put much into the Mac Pro because its sales volume is likely to be small. But I suspect the Mac Pro, exceptionally, isn't just about profits. Apple may consider it important, for the continued health of the Apple ecosystem, that the kinds of users that would buy Mac Pros remain within it.

Plus, as a halo product, its existence serves a marketing function for non-Mac Pro buyers - recall the several Mac Pros featured prominently in the background during Johny Srouji's presentations about Apple Silicon. In this latter sense Apple could take a small loss on the Mac Pro and consider it part of its marketing budget (not an exact analogy, but consider the marketing effect of Honda's participation in Formula 1).

Finally, there's the intangible benefit of the Mac Pro to engineer morale within Apple itself. If I were a hardware engineer at Apple, I suspect I'd like it that my company continues to produce a product like the Mac Pro (akin to the effect of Honda's Formula 1 efforts on the Honda engineers working on Accords).
All true. The question is how much are they willing to spend on that?

That’s why I think that whatever the solution is, they will seek to leverage across their platforms. Hence my theory that at some point they may partition things differently, so that the GPU die are separate from the CPU die, etc.
 