M3 core counts and performance

In summary, I think a separate 27" display + computer is a better solution than the iMac, but only if Apple makes that combo as cost-accessible to consumers as the 27" iMac, which they've not done.
I agree - the Studio Display would be perfect at $1000, ordinary Apple-priced at $1200. And the latter would allow a base Mini + Studio Display to slot in at $1800, same as the old base 5K iMac.

I also agree there's lots of other discontent based on price, such as what happens when you choose to go above the base RAM. That was a major attraction of the 27" iMac for lots of people: you could buy a base model and max its RAM out far cheaper than Apple charged.
 
I agree - the Studio Display would be perfect at $1000, ordinary Apple-priced at $1200. And the latter would allow a base Mini + Studio Display to slot in at $1800, same as the old base 5K iMac.
...a less-fancy Retina display at $1,000 would work as well.
I also agree there's lots of other discontent based on price, such as what happens when you choose to go above the base RAM. That was a major attraction of the 27" iMac for lots of people: you could buy a base model and max its RAM out far cheaper than Apple charged.
I think that's part of the reason Apple really likes UMA. In addition to the performance benefits, it also allows them to justify not allowing upgradeable RAM in their desktops, hence increasing their profit margins. [If Apple Silicon retained a "conventional"* memory architecture, they'd have a big PR issue offering their desktops with soldered RAM only.] Thus it serves dual purposes.

*What they had under Intel, with separate DRAM for the CPU and discrete GPU.
 
Last edited:
I think that's part of the reason Apple really likes UMA. In addition to the performance benefits, it also allows them to justify not allowing upgradeable RAM in their desktops, hence increasing their profit margins
UMA absolutely does not dictate non-upgradable memory. The use of LPDDR memory modules dictates that.
 
UMA absolutely does not dictate non-upgradable memory. The use of LPDDR memory modules dictates that.
UMA with non-soldered memory is not terribly functional. The bandwidth/space/power considerations of trying to use DDR with Apple’s setup are considerable, even for a desktop. So as a point of practicality, UMA requires LPDDR/soldering.
 
Why not?

the memory controllers don’t care if the memory is soldered.
I had edited the post just as you posted to explain further. You *can* do it, but it isn't practical. Especially for smaller memory sizes, achieving the bandwidth would require a large number of DDR sticks in configurations that don't exist, and even for a desktop you're talking about filling a huge number of slots, which is a massive space and power crunch. For laptops and smaller devices that isn't possible; for the Studio it would be doable but, I'd argue, still impractical. Only in the tower could you really do it, and even then it'd be ugly, and you'd have to make special additional hardware just for that one device, and Apple doesn't really seem all that interested in the tower.

I remember @leman worked out how many sticks it would take, and I remember Hector doing the same while also measuring the power costs. There's a reason GPUs use GDDR rather than banks of DDR: for bandwidth reasons, DDR just isn't practical for GPUs, and the UMA has to feed the Apple GPU.
 
Last edited:
Has anybody been able to discern any useful information (or glean anything from the released die shots) that might help us better speculate on the M3 Ultra (2x M3 Max chips, or the long-rumored 4x M3 Max chips)?
 
My take on this is a bit different. To me, the main benefit of the iMac is that it forced you to use a *great* display with your Mac. The display is a huge part of the experience of using a computer, and yet the amount of people who pair a $200 display with their >$2000 computer is mind-blowing to me.
I think this is a result of how products are reviewed, particularly for gaming or “enthusiast” products. The decision of which computer to purchase ends up being a matter of ticking a handful of boxes in a spreadsheet. The issue here is that the specifications that reviewers highlight are not a subset of the specifications that matter for a computer, but rather a subset of the specifications that matter *and can be easily measured*.

So, for displays you often see reviewers talking about size, resolution and refresh rate. Sometimes brightness is also mentioned (rarely), and color or color accuracy is almost never mentioned! For a display! So people end up buying the cheapest 4K 144Hz display on the market, where the darkest black is light gray, viewing angles are shit and brightness goes up to about half the brightness of an Apple display.

I think this became so common that a display priced at $700 (half the cost of a 5K UltraFine) would already be considered outrageously expensive, even by iMac buyers. Even more so for the actual price of the UltraFine 5K. I don't think the display market has moved much in the right direction these past years, so the situation is the same as a decade ago in this regard. People are going to buy shitty displays for their Macs.


To be absolutely fair, there aren't any other true Retina displays on the market besides the Studio Display and the Pro Display XDR. The most popular size is 27" @ 4K, which is a bad combination for macOS. In fact, there seem to be fewer 5K (or smaller-size 4K) displays now than in 2015. It seems the market settled for lower DPI, and no one but Apple is pushing higher-res monitors.
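For anyone wondering why 27" @ 4K is singled out: macOS renders scaled HiDPI modes at 2x the "looks like" size and then downsamples to the panel's native pixels. A quick sketch of that arithmetic (the resolutions used are the standard panel values, assumed here for illustration):

```python
def hidpi_backing(looks_like, panel):
    """macOS renders a scaled HiDPI mode at 2x the 'looks like' size,
    then downsamples that backing store to the panel's native pixels."""
    lw, lh = looks_like
    pw, ph = panel
    backing = (lw * 2, lh * 2)
    return backing, backing[0] / pw  # downsample ratio (1.0 = pixel-perfect)

# 27" 4K panel with the comfortable "looks like 2560x1440" UI size:
print(hidpi_backing((2560, 1440), (3840, 2160)))  # ((5120, 2880), ~1.33)
# 27" 5K panel, same UI size:
print(hidpi_backing((2560, 1440), (5120, 2880)))  # ((5120, 2880), 1.0)
```

The 4K case downsamples by a fractional ~1.33x, which is what softens text; the 5K case maps the backing store 1:1 onto the panel.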

Permit me to offer a different perspective. All of this discomfort (both here and at MR)—with Apple's RAM sizes, SSD sizes, and AIO's vs. separates—isn't about what Apple does and doesn't offer. Instead, it's fundamentally about cost. You can get both the 13" and 15" Airs with 24 GB/2TB, which is enough to cover most use cases of that form factor (OK, maybe 32 GB would be nice). And you can get both the 14" and 16" MBPs with 128 GB/8TB. You just have to pay for it.

I'm not saying those cost concerns aren't legitimate. I'm saying we should recognize them for what they are.

[This is not necessarily the case with the Ultra Studio and MP—there you might actually need >192 GB RAM, or a more powerful GPU than what Apple offers; but if you want to stay in MacOS you don't have the option to get those.]

Likewise, I'd venture to say the overwhelming majority of those complaining about the lack of a 27" AS iMac aren't doing so because a Mini+ASD setup is unacceptable to them. [Indeed, I agree it's not great to bundle a display with a PC, since the former often has a longer life than the latter.] Rather, it's about cost. With the iMac, if you wanted a big, gorgeous screen (which most people would appreciate), yet didn't have heavy computing needs (also the case for most), you could get a 27" entry-level iMac (starting at $1,800). You can do that with the Mini + ASD, but the starting cost is significantly greater ($600 + $1600 = $2200). And I do think that is a legitimate complaint.

I think the real problem is that MacOS renders text differently from Windows, such that a Retina display really is needed for it to look good. And I say that as someone who is looking at a 27" Retina (218 ppi) side-by-side with a 27" 4k (163 ppi) and 24" WUXGA (94 ppi). The differences are clear. Yet while Apple offers consumer-priced PC's, they don't offer consumer-priced Retina displays (something with a BOM and quality comparable to that in the 24" iMac). And no one else does either.
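Those ppi figures follow directly from pixel dimensions and diagonal size. A quick check (the panel resolutions are the standard values for those display classes, assumed here rather than stated in the post):

```python
import math

def ppi(width_px, height_px, diag_in):
    """Pixels per inch from pixel dimensions and diagonal size."""
    return math.hypot(width_px, height_px) / diag_in

# The three displays mentioned above:
displays = {
    '27" 5K Retina': (5120, 2880, 27),
    '27" 4K':        (3840, 2160, 27),
    '24" WUXGA':     (1920, 1200, 24),
}
for name, (w, h, d) in displays.items():
    print(f'{name}: {ppi(w, h, d):.0f} ppi')
# 27" 5K Retina: 218 ppi
# 27" 4K: 163 ppi
# 24" WUXGA: 94 ppi
```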

Asking a consumer who's spent, say, $800 on a 16GB/256GB Mini, or $1,200 on a 16GB/256GB M1 Air, to shell out $1,600 for an external display is a lot. And remember the number of consumers that buy machines in this price range is a lot more than the number that buy the higher-end devices, so this impacts a lot of their customers.

Thus Apple has created an OS that requires a Retina display to look its best, without providing a cost-reasonable (for Apple--I'm still thinking >=$800) way for consumer (as opposed to prosumer) purchasers to acquire a Retina external monitor.

In summary, I think a separate 27" display + computer is a better solution than the iMac, but only if Apple makes that combo as cost-accessible to consumers as the 27" iMac, which they've not done.

Anyway, that's my rant on this.
I did mention the economics of the iMac in my post if you wanted the high dpi display, but it's fair to say that I forgot about the OS making 4K 27" monitors not look so good - mostly because I have a 27" iMac ... from 2013. :) How bad is it btw? I actually am considering getting such a monitor when I upgrade to an M3:

Amazon product ASIN B09NF49CDZ
The built-in KVM, so I could just switch to a Linux desktop at will, is very attractive, but if the experience of plugging the Apple laptop/desktop into the monitor is really that bad, I might not bother.
 
Last edited:
Ladies and Gentlemen…commence your outrage!
[attached screenshot]
 
Has anybody been able to discern any useful information (or glean anything from the released die shots) that might help us better speculate on the M3 Ultra (2x M3 Max chips, or the long-rumored 4x M3 Max chips)?
No clue yet. I think the Maxes won't reach customers until late November so we'll probably find out then or shortly after. The rumors are currently unkind to the prospect of 4 M3s, but we'll see.
 
I did mention the economics of the iMac in my post if you wanted the high dpi display, but it's fair to say that I forgot about the OS making 4K 27" monitors not look so good - mostly because I have a 27" iMac ... from 2013. :) How bad is it btw? I actually am considering getting such a monitor when I upgrade to an M3:

Amazon product ASIN B09NF49CDZ
The built-in KVM, so I could just switch to a Linux desktop at will, is very attractive, but if the experience of plugging the Apple laptop/desktop into the monitor is really that bad, I might not bother.
Depends how sensitive you are. I was happy with 4k@27" (163 ppi) through High Sierra, because it included subpixel anti-aliasing, which effectively increased horizontal text resolution. When I upgraded to Mojave, I didn't care for how it displayed text on my 4k, and later determined that was the reason. Thus I realized if I were going to continue to stay current with MacOS, I'd need a 27" 5k, which is why I bought my 2019 iMac.

I know, however, that others are fine with 4k@27" on current versions of MacOS. So it depends on you.
 
I don’t want to honor it with a view lol
Is the complaint just the usual "it gets to 108°C before the fans ramp up", or is there more to it?
The Air is gonna be bad bad because it’s hot I think.

As far as I can tell, they find something outrage-worthy, then work their way backwards. So they start the video by saying “I’m really worried the M3 is gonna get really hot. Let’s see”.
 
UMA absolutely does not dictate non-upgradable memory. The use of LPDDR memory modules dictates that.
Why not?

the memory controllers don’t care if the memory is soldered.
OK, thanks for the correction. I'll add a strikeout to my post. I had thought UMA needed to be slotted, because it's used for both the CPU and GPU, and I'd never heard of slotted GPU RAM. But it sounds like that's for other reasons.
 
OK, thanks for the correction. I'll add a strikeout to my post. I had thought UMA needed to be slotted, because it's used for both the CPU and GPU, and I'd never heard of slotted GPU RAM. But it sounds like that's for other reasons.
The reason you've never heard of it is that it doesn't exist. @Cmaier and @quarkysg are technically correct, but the non-soldered solutions are impractical. It's why GDDR exists and why Apple went with LPDDR for its solution. For instance, the minimum DDR5 module is 8GB, but that doesn't get you the maximum per-module bandwidth of 51GB/s; you need the 16GB module for that. That means the base M3 would require two 16GB sticks for 32GB of minimum RAM, while the Pro SoC would require three such sticks for a minimum of 48GB of RAM. The Max SoC would require 8 such sticks for 128GB of minimum RAM, and the Ultra would require 16 sticks for 256GB of *minimum* RAM. For some of these, the minimum RAM required to get the necessary bandwidth is greater than the maximum RAM offered, and for the final two it's equal to it. I know we all want Apple to raise the minimum RAM but uhhh ... hmmm ... and that's not even considering the space/power requirements of all those sticks, which for some of the devices these SoCs are destined for is a complete non-starter, and for others at the very least a huge compromise.

EDIT: The only way I can think of to make this work (for desktop only) would be for Apple to commission custom, compact DDR5 modules with lower per-module capacity but the same bandwidth, and I don't even know if that is feasible. As I said, 8GB DDR5 exists but is single-channel only. Again, it would also only work for the desktop chips, splitting the packaging of desktop and laptop SoCs. It also wouldn't really solve the cost issue, since Apple would be one of the only providers for a niche market, which, as we've been discussing for displays, isn't a good way to get costs down. Bottom line: even this idea, which may not even be technically feasible, doesn't make DDR5 practical as a solution. Soldered LPDDR5 modules are just that much better.
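The stick counts above can be reproduced with some back-of-the-envelope arithmetic. A sketch, assuming ~51.2 GB/s per DDR5-6400 module (one 64-bit channel) and the per-SoC bandwidth targets implied by the post's numbers (100/150/400/800 GB/s, not official Apple figures):

```python
import math

PER_STICK_BW_GB_S = 51.2  # DDR5-6400, one 64-bit channel per module
STICK_SIZE_GB = 16        # smallest module that reaches full per-module bandwidth

# Bandwidth targets implied by the post's stick counts (assumed figures):
targets_gb_s = {"M3": 100, "M3 Pro": 150, "M3 Max": 400, "M3 Ultra (2x Max)": 800}

for chip, bw in targets_gb_s.items():
    sticks = math.ceil(bw / PER_STICK_BW_GB_S)      # modules needed for bandwidth
    print(f"{chip}: {sticks} sticks -> minimum {sticks * STICK_SIZE_GB} GB")
# M3: 2 sticks -> minimum 32 GB
# M3 Pro: 3 sticks -> minimum 48 GB
# M3 Max: 8 sticks -> minimum 128 GB
# M3 Ultra (2x Max): 16 sticks -> minimum 256 GB
```

The "minimum RAM" for each chip is forced purely by the number of channels needed to hit the bandwidth, which is the post's point.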
 
Last edited:
Thus Apple has created an OS that requires a Retina display to look its best, without providing a cost-reasonable (for Apple--I'm still thinking >=$800) way for consumer (as opposed to prosumer) purchasers to acquire a Retina external monitor.
This is true, but c'mon, the first Retina Mac was released in 2012. The last non-Retina Mac was the 2017 MacBook Air. It's just maddening sometimes how far behind the tech world lags in some things. I can't believe there are still so few high-DPI displays out there.

I agree - the Studio Display would be perfect at $1000, ordinary Apple-priced at $1200. And the latter would allow a base Mini + Studio Display to slot in at $1800, same as the old base 5K iMac.
Similar displays are not that much cheaper. The 27" 5K UltraFine is $1200 already, with much worse build quality. Samsung's suspiciously-Studio-Display-looking ViewFinity S9 is $1600.
I think these panels are just expensive, maybe due to being unusual configurations outside of the Apple world. The last Thunderbolt Display was $999 and the build quality was similar to the Studio Display, there's gotta be a reason for the huge price bump.
 
A couple of interesting notes:

The CPU never got above 3.74 GHz. Perhaps 4 GHz is reserved for the Max?

During the Cinebench run, they state that it draws the same power as the M2 (20 watts). What does this say about Geekerwan's 14-watt figure in the A17 review?
Oh man, now I want to watch it to pick it apart. I must resist 🧘‍♂️

My gut feeling is the testing is wrong and/or they're drawing a conclusion with insufficient evidence. There's no way it isn't hitting the same ST clocks as the M3 Pro and Max; there's no difference in ST performance. Plus, I find it hard to believe the cooling solution could be so bad that it can't maintain peak ST clocks.
Maybe he’s deliberately feigning ignorance about clocks - e.g. expecting it to run 4.05GHz on all cores despite every prior M SoC having two clock states for ST and MT. Or maybe that weird-ass-looking Intel Power Gadget clone isn’t reporting accurately.

Anyway, I’m just speculating on a video I haven’t watched now so I should shut up 😂
 