M3 core counts and performance

Oh man, now I want to watch it to pick it apart. I must resist 🧘‍♂️

My gut feeling is the testing is wrong and/or they’re drawing a conclusion with insufficient evidence. There’s no way it isn’t hitting the same ST clocks as M3 Pro and Max - there’s no difference in ST performance. Plus, I find it hard to believe the cooling solution could be so bad that it can’t maintain peak ST clocks.
Maybe he’s deliberately feigning ignorance about clocks - e.g. expecting it to run 4.05GHz on all cores despite every prior M SoC having two clock states for ST and MT. Or maybe that weird-ass looking Intel Power Gadget clone isn’t reporting accurately.

Anyway, I’m just speculating on a video I haven’t watched now so I should shut up 😂
Yeah, I think they haven't tested anything other than Cinebench. Perhaps that’s why it didn’t hit 4GHz?
 
This is true but c'mon, the first Retina Mac was released in 2012. The last non-Retina Mac was the 2017 MacBook Air. It's just maddening sometimes how far the tech world lags behind in some things. I can't believe there are still so few high-DPI displays out there.


Similar displays are not that much cheaper. The 27" 5K UltraFine is $1200 already, with much worse build quality. Samsung's suspiciously-Studio-Display-looking ViewFinity S9 is $1600.
I think these panels are just expensive, maybe due to being unusual configurations outside of the Apple world. The last Thunderbolt Display was $999 and the build quality was similar to the Studio Display, there's gotta be a reason for the huge price bump.
5K displays are even harder than 4K to drive. Sure, GPUs have gotten more powerful so that shouldn’t be as big of a deal, but for a lot of office work computers with small iGPUs it might still make a difference. On top of that, standard techniques which I guess Apple refuses to use can make text look great at 4K. Sure it would look even better at 5K, but you’re fighting against “good enough” for a display. I know 8K TV screens are a thing now, so maybe higher resolution displays will eventually make their way to the monitor world, but unlike for movies and mobile it just isn’t seen as much of a necessity. 4K seems to be where the current sweet spot settled, whether by accident of history or technical reasons.
 
Oh man, now I want to watch it to pick it apart. I must resist 🧘‍♂️
Yeah, it is mostly blather. I watched it with captions, muted, probably the best plan for those guys – for a lot of reviewers, really. And the in-video ad was best not heard.
My gut feeling is the testing is wrong and/or they’re drawing a conclusion with insufficient evidence. There’s no way it isn’t hitting the same ST clocks as M3 Pro and Max - there’s no difference in ST performance.

One of his points was the 14" has a single, smaller fan rather than the two larger ones from before, and the heatsink is smaller. He ran "our stress test", whatever that is, but I imagine it loads up the processor so all the cores are running. AIUI, M3 will not hit 4.0GHz if more than one P-core is running, so it's no surprise it showed slower.
 
You guys know you can get a "retina quality" display, or close to it in terms of resolution, in a 27" 4K IPS monitor for like... 300 bucks, right?
 
You guys know you can get a "retina quality" display, or close to it in terms of resolution, in a 27" 4K IPS monitor for like... 300 bucks, right?
The problem is, according to Apple, that isn’t “retina” at that size - it has to be 5K at 27”. And apparently text at 4K looks fuzzy coming out of macOS. I don’t have one myself, but I’ve seen this reported multiple times, including in this thread. Some people aren’t bothered by it, others are.

Btw I agree with you that 4K should be more than fine and Apple should treat 4K 27” in macOS as a first class viewing standard, but apparently they don’t.
 
You guys know you can get a "retina quality" display, or close to it in terms of resolution, in a 27" 4K IPS monitor for like... 300 bucks, right?
I'm running my M1 Mini connected to a LG 32" 4K panel, scaled to 1440p effective resolution. Works fine for me.

For a 4K 27" panel, to go "retina" you have to set it to scale to 1080p effective resolution, but all elements will appear big. Full 4K effective resolution will be too small to see clearly. So for me, the only workable scaled resolution is 1440p for a 4K panel when connected to recent Macs.
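If it helps, here's the quick back-of-the-envelope math behind all this (plain Python, nothing Mac-specific; the ~218 PPI figure is the density Apple's own 5K desktop displays ship at, and the render-at-2x-then-downsample behaviour for scaled modes is my understanding of how macOS handles it):

```python
import math

def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a panel of the given pixel dimensions and diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

# 27" panels: 5K vs 4K
print(f'27" 5K: {ppi(5120, 2880, 27):.0f} PPI')  # ~218 PPI -- the density Apple treats as "retina" on the desktop
print(f'27" 4K: {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI -- well short of that

# The "looks like 1440p" mode on a 4K panel: macOS renders the desktop at 2x
# (5120x2880) and downsamples it to the panel's native 3840x2160 -- a
# non-integer 1.33:1 scale per axis, which is where the slight softness comes from.
render = (2560 * 2, 1440 * 2)
native = (3840, 2160)
print(f"render {render} -> panel {native}, {render[0] / native[0]:.2f}x per axis")
```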
 
I'm running my M1 Mini connected to a LG 32" 4K panel, scaled to 1440p effective resolution. Works fine for me.

For a 4K 27" panel, to go "retina" you have to set it to scale to 1080p effective resolution, but all elements will appear big. Full 4K effective resolution will be too small to see clearly. So for me, the only workable scaled resolution is 1440p for a 4K panel when connected to recent Macs.
Good to know that it looks good to you! I’ve seen people complain. I’m interested in trying it myself but definitely curious about others’ opinions.
 
On top of that, standard techniques which I guess Apple refuses to use can make text look great at 4K. Sure it would look even better at 5K, but you’re fighting against “good enough” for a display. I know 8K TV screens are a thing now, so maybe higher resolution displays will eventually make their way to the monitor world, but unlike for movies and mobile it just isn’t seen as much of a necessity. 4K seems to be where the current sweet spot settled, whether by accident of history or technical reasons.
My point was that Apple did support subpixel antialiasing for font rendering for a looooong time. It was pretty much required for old displays. They stopped doing it in macOS Mojave (2018). Why? A couple of reasons come to mind:
- Apple is evil (the favorite option at the other place).
- The feature has been around for so long that the code implementing it was old, refactoring that code for what is essentially a “legacy” feature couldn’t be justified, and engineers wanted to get rid of it.
- Getting rid of it helped achieve some other goal, like greatly simplifying existing code paths and speeding up development of other related features.

A quick Google search seems to indicate that the winner is option 3. From Hacker News:
ex-MacOS SWE here. Subpixel antialiasing is obnoxious to implement. It requires threading physical pixel geometry up through multiple graphics layers, geometry which is screen-dependent (think multi-monitor). It multiplies your glyph caches: glyph * subpixel offset. It requires knowing your foreground and background colors at render time, which is an unnatural requirement when you want to do GPU-accelerated compositing. There's tons of ways to fall off of the subpixel antialiased quality path, and there's weird graphical artifacts when switching from static to animated text, or the other way. What a pain!
Nevertheless there's no denying that subpixel-AA text looks better on 1x displays. Everyone notices when it's not working, and macOS will look worse without it (on 1x displays).
Now I have no proof that this person is an ex-Apple engineer, but what they say makes sense. So it’s reasonable to get rid of it. Maybe not as soon as Mojave (2018), when the last non-Retina Mac was released in 2017 (MacBook Air), but definitely by that last Mac’s end of life (2023-2024).
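To make the glyph-cache point from that quote a bit more concrete, here's a toy sketch (my own illustration in Python, not anything resembling Apple's actual code) of what the cache key ends up carrying once subpixel AA is involved:

```python
from dataclasses import dataclass

# With plain grayscale AA, a rasterized glyph can be cached per (font, size, glyph).
@dataclass(frozen=True)
class GrayscaleGlyphKey:
    font: str
    size_px: float
    glyph_id: int

# With subpixel AA, the same glyph needs a separate bitmap per quantized subpixel
# offset (text isn't pixel-aligned), per stripe order of the physical screen
# (RGB vs BGR, so multi-monitor matters), and the blend depends on foreground and
# background colors known at render time -- the "multiplies your glyph caches"
# and "threading physical pixel geometry" complaints from the quote above.
@dataclass(frozen=True)
class SubpixelGlyphKey:
    font: str
    size_px: float
    glyph_id: int
    subpixel_offset: int           # e.g. quantized to quarters of a pixel: 0..3
    stripe_order: str              # "RGB" or "BGR", depends on the target display
    fg_color: tuple[int, int, int]
    bg_color: tuple[int, int, int]

# Back-of-the-envelope blow-up: 4 offsets x 2 stripe orders = 8x the entries
# per glyph, before even counting distinct foreground/background color pairs.
```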
 
My point was that Apple did support subpixel antialiasing for font rendering for a looooong time. It was pretty much required for old displays. They stopped doing it in macOS Mojave (2018). Why? A couple of reasons come to mind:
- Apple is evil (the favorite option at the other place).
- The feature has been around for so long that the code implementing it was old, refactoring that code for what is essentially a “legacy” feature couldn’t be justified, and engineers wanted to get rid of it.
- Getting rid of it helped achieve some other goal, like greatly simplifying existing code paths and speeding up development of other related features.

A quick Google search seems to indicate that the winner is option 3. From Hacker News:

Now I have no proof that this person is an ex-Apple engineer, but what they say makes sense. So it’s reasonable to get rid of it. Maybe not as soon as Mojave (2018), when the last non-Retina Mac was released in 2017 (MacBook Air), but definitely by that last Mac’s end of life (2023-2024).
Apple has always been rather quick to rid itself of no-longer-“needed” vestigial components, be they software or hardware. Especially if they’re troublesome, which it sounds like this one was. And I get it: all their new machines shipped with “retina” displays, and at that time the only headless machines were the cheap minis and the niche Pro towers.

But if they thought the industry was going to follow suit as it did in mobile, well, it didn’t. More’s the pity, probably; I can’t imagine producing 5K displays at scale would be that much more expensive than 4K, certainly not relative to what we see today. But for the reasons mentioned in my previous post, I can equally understand why the industry didn’t follow Apple into a high-DPI race the way the mobile market did. Apple couldn’t really put pressure on monitor makers like it could in mobile or even laptops. So the industry got to 4K and largely stopped (color is a different matter, interestingly). And we are where we are.
 
Or maybe that weird-ass looking Intel Power Gadget clone isn’t reporting accurately.
I would say this tool is probably mostly okay. He's getting the raw data by running powermetrics and parsing its output. Early on I noticed a discrepancy between some of the graphs and what an instance of powermetrics was saying in a terminal window, reported it to him, and he fixed the problem. (Something to do with how he was averaging iirc, I forget the exact details.)

Should we completely trust powermetrics? Personally I don't, since I've seen too many odd things in its output, but I do think it's a reasonable approximation of the truth for many purposes.
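For anyone who wants to sanity-check a tool like that themselves, something along these lines is roughly what I'd expect it to do under the hood. Minimal sketch only: it assumes the --samplers cpu_power output contains per-cluster "HW active frequency" lines, which is what I remember seeing on M-series machines, but the exact wording varies by macOS version and chip, so the regex is a guess.

```python
import re
import subprocess

# Grab one 1-second cpu_power sample (powermetrics needs sudo).
out = subprocess.run(
    ["sudo", "powermetrics", "--samplers", "cpu_power", "-i", "1000", "-n", "1"],
    capture_output=True, text=True, check=True,
).stdout

# On Apple Silicon the sample includes per-cluster lines roughly like
# "P0-Cluster HW active frequency: 4056 MHz". The exact wording and cluster
# names differ between macOS versions and chips, so treat this pattern as a guess.
pattern = re.compile(r"^(\S*Cluster) HW active frequency:\s*([\d.]+)\s*MHz", re.MULTILINE)
for cluster, mhz in pattern.findall(out):
    print(f"{cluster}: {float(mhz):.0f} MHz")
```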
 
But if they thought the industry was going to follow suit as it did in mobile, well, it didn’t. More’s the pity, probably; I can’t imagine producing 5K displays at scale would be that much more expensive than 4K, certainly not relative to what we see today. But for the reasons mentioned in my previous post, I can equally understand why the industry didn’t follow Apple into a high-DPI race the way the mobile market did. Apple couldn’t really put pressure on monitor makers like it could in mobile or even laptops. So the industry got to 4K and largely stopped (color is a different matter, interestingly). And we are where we are.
Yeah, it’s a shame we got stuck on 4K instead of 5K. It seems like the high-end monitor market ended up in a race for higher refresh rates instead of higher DPI. Around 2015 there were quite a few 27” 5K monitors released, but public interest must have been mild at best, because most of them were dropped and replaced by 4K versions over the next couple of years. It’s only recently that we’ve seen more 5K monitors available.

I’m happy with companies supporting old stuff when it doesn’t restrict further development (I believe you can still connect a 1st gen iPod to macOS Sonoma and manage the songs on it) but some things are just roadblocks to improving the user experience for virtually all users. IMHO it’s worth it to annoy a few users by dropping dead weight if it’s going to improve the experience for the vast majority.
 
I would say this tool is probably mostly okay. He's getting the raw data by running powermetrics and parsing its output. Early on I noticed a discrepancy between some of the graphs and what an instance of powermetrics was saying in a terminal window, reported it to him, and he fixed the problem. (Something to do with how he was averaging iirc, I forget the exact details.)

Should we completely trust powermetrics? Personally I don't, since I've seen too many odd things in its output, but I do think it's a reasonable approximation of the truth for many purposes.
Yeah I have no reason to doubt it really. Just found it funny that someone decided to copy the interface from Intel Power Gadget 🙂
 
The problem is, according to Apple, that isn’t “retina” at that size - it has to be 5K at 27”. And apparently text at 4K looks fuzzy coming out of macOS. I don’t have one myself, but I’ve seen this reported multiple times, including in this thread. Some people aren’t bothered by it, others are.

Btw I agree with you that 4K should be more than fine and Apple should treat 4K 27” in macOS as a first class viewing standard, but apparently they don’t.
I've run my Macs on 4K 27" and it looks fine to me.
 
I don’t want to honor it with a view lol
Is the complaint just the usual “it gets to 108°C before the fans ramp up”, or is there more to it?
Yes, it hits 108°C before the fans ramp up. After that it sits at 90-95°C.
The video is unsurprisingly really bad. He tests Cinebench scores while running Mx Power Gadget and some fan control app in the background. I did not watch the whole video, only some parts. It was, as usual, lies and misleading info.
 
The feature has been around for so long that the code implementing it was old, refactoring that code for what is essentially a “legacy” feature couldn’t be justified, and engineers wanted to get rid of it.
I suspect this is the reason. The font rendering API(s!) was a mishmash for a long time, if only because it had been such a high priority in Apple’s early days. More recently, they just yeeted it and, I suppose, expected people to climb on board the Retina train. As far as I know, they haven’t readdressed it since.
 
I would say this tool is probably mostly okay. He's getting the raw data by running powermetrics and parsing its output. Early on I noticed a discrepancy between some of the graphs and what an instance of powermetrics was saying in a terminal window, reported it to him, and he fixed the problem. (Something to do with how he was averaging iirc, I forget the exact details.)

Should we completely trust powermetrics? Personally I don't, since I've seen too many odd things in its output, but I do think it's a reasonable approximation of the truth for many purposes.
Yeah, powermetrics can be a bit wonky sometimes and should be sanity-checked against wall power measurements at least (which have their own issues, obviously), but it’s often reasonable.

This is absolute bullshit 🙄

Eh, that’s not too far off, IMHO.

I lean more towards @Aaronage’s opinion myself. Yes, Apple has very good memory compression, and fast SSDs make swap far more bearable than in the bad old days of spinning rust, but to say 8GB is the equivalent of 16GB on a PC? Hmmm … that’s BS marketing rationalization unless backed up by extraordinary evidence.
 
Yeah, powermetrics can be a bit wonky sometimes and should be sanity-checked against wall power measurements at least (which have their own issues, obviously), but it’s often reasonable.

I lean more towards @Aaronage’s opinion myself. Yes, Apple has very good memory compression, and fast SSDs make swap far more bearable than in the bad old days of spinning rust, but to say 8GB is the equivalent of 16GB on a PC? Hmmm … that’s BS marketing rationalization unless backed up by extraordinary evidence.
Yes. I’m no expert in Windows memory management; however, this statement is too vague to mean much. Equal in what, for which applications, etc.?

I do think Apple Silicon Macs are good at dealing with swap generally, but I still think they should provide more RAM on all their Macs.
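FWIW, if anyone wants to see how their own 8GB machine is actually coping, the swap and compressor numbers are easy to pull. Quick sketch using sysctl vm.swapusage and vm_stat (both standard macOS tools); the exact wording of their output lines is from memory, so the filtering here is approximate:

```python
import subprocess

# Swap file usage, printed as something like
# "vm.swapusage: total = 2048.00M  used = 312.25M  free = ..."
print(subprocess.run(["sysctl", "vm.swapusage"],
                     capture_output=True, text=True).stdout.strip())

# vm_stat reports page counts; the compressor lines show how much load memory
# compression is carrying, and the swap lines how often it spills over to disk.
vm = subprocess.run(["vm_stat"], capture_output=True, text=True).stdout
for line in vm.splitlines():
    if "compressor" in line.lower() or "swap" in line.lower():
        print(line.strip())
```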
 