M3 core counts and performance

Jimmyjames

Site Champ
Posts
772
Reaction score
872
Oh man, now I want to watch it to pick it apart. I must resist 🧘‍♂️

My gut feeling is the testing is wrong and/or they're drawing a conclusion with insufficient evidence. There's no way it isn't hitting the same ST clocks as M3 Pro and Max - there's no difference in ST performance. Plus, I find it hard to believe the cooling solution could be so bad that it can't maintain peak ST clocks.
Maybe he's deliberately feigning ignorance about clocks - e.g. expecting it to run 4.05GHz on all cores despite every prior M SoC having two clock states for ST and MT. Or maybe that weird-ass looking Intel Power Gadget clone isn't reporting accurately.

Anyway, I'm just speculating on a video I haven't watched now so I should shut up 😂
Yeah, I think they haven't tested anything other than Cinebench. Perhaps that's why it didn't hit 4GHz?
 

dada_dave

Elite Member
Posts
2,300
Reaction score
2,324
This is true but c'mon, the first Retina Mac was released in 2012. The last non-Retina Mac was the 2017 MacBook Air. It's just maddening sometimes how far the tech world lags in some things. I can't believe there are still so few high-DPI displays out there.


Similar displays are not that much cheaper. The 27" 5K UltraFine is $1200 already, with much worse build quality. Samsung's suspiciously-Studio-Display-looking ViewFinity S9 is $1600.
I think these panels are just expensive, maybe due to being unusual configurations outside of the Apple world. The last Thunderbolt Display was $999 and the build quality was similar to the Studio Display, there's gotta be a reason for the huge price bump.
5K displays are even harder than 4K to drive (5120x2880 is about 78% more pixels than 3840x2160). Sure, GPUs have gotten more powerful so that shouldn't be as big of a deal, but for a lot of office work computers with small iGPUs it might still make a difference. On top of that, standard techniques which I guess Apple refuses to use can make text look great at 4K. Sure, it would look even better at 5K, but you're fighting against "good enough" for a display. I know 8K TV screens are a thing now so maybe higher resolution displays will eventually make their way to the monitor world, but unlike for movies and mobile it just isn't seen as much of a necessity. 4K seems to be where the current sweet spot settled, whether by accident of history or technical reasons.
 

Yoused

up
Posts
5,701
Reaction score
9,100
Location
knee deep in the road apples of the 4 horsemen
Oh man, now I want to watch it to pick it apart. I must resist 🧘‍♂️
Yeah, it is mostly blather. I watched it with captions, muted, probably the best plan for those guys - for a lot of reviewers, really. And the in-video ad was best not heard.
My gut feeling is the testing is wrong and/or they're drawing a conclusion with insufficient evidence. There's no way it isn't hitting the same ST clocks as M3 Pro and Max - there's no difference in ST performance.

One of his points was the 14" has a single, smaller fan rather than the two larger ones from before, and the heatsink is smaller. He ran "our stress test", whatever that is, but I imagine it loads up the processor so all the cores are running. AIUI, M3 will not hit 4.0GHz if more than one P-core is running, so no surprise it showed slower.
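If anyone wants to check that behaviour themselves, a rough sketch of the idea is below: load one core, then all cores, and watch the reported cluster clocks/power in another terminal (e.g. with `sudo powermetrics --samplers cpu_power`). The 30-second phases are arbitrary, and the two-clock-state expectation comes from this discussion, not from anything I've measured:

```python
# Sketch: load 1 core vs. all cores, then compare the clocks/power that
# powermetrics reports during each phase. Pure busy-work, nothing M3-specific.
import multiprocessing as mp
import time

def burn(seconds: float) -> None:
    """Spin in a tight loop to keep one core busy for `seconds`."""
    end = time.monotonic() + seconds
    x = 0
    while time.monotonic() < end:
        x += 1  # pointless work, just stays on-core

if __name__ == "__main__":
    # Single-threaded phase: this is where the ST boost clock should show up.
    print("ST phase: 1 worker for 30 s")
    p = mp.Process(target=burn, args=(30,))
    p.start()
    p.join()

    # Multi-threaded phase: expect the lower all-core clock.
    n = mp.cpu_count()
    print(f"MT phase: {n} workers for 30 s")
    procs = [mp.Process(target=burn, args=(30,)) for _ in range(n)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```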
 

dada_dave

Elite Member
Posts
2,300
Reaction score
2,324
You guys know you can get a "retina quality" display, or close to it in terms of screen resolution, in a 27" 4K IPS monitor for like... 300 bucks, right?
The problem is, according to Apple, that isn't "retina" at that size - it has to be 5K at 27". And apparently text at 4K looks fuzzy coming out of macOS. I don't have one myself, but I've seen this reported multiple times, including in this thread. Some people aren't bothered by it, others are.

Btw I agree with you that 4K should be more than fine and Apple should treat 27" 4K in macOS as a first-class viewing standard, but apparently they don't.
 

quarkysg

Power User
Posts
76
Reaction score
55
You guys know you can get a "retina quality" display, or close to it in terms of screen resolution, in a 27" 4K IPS monitor for like... 300 bucks, right?
I'm running my M1 Mini connected to an LG 32" 4K panel, scaled to 1440p effective resolution. Works fine for me.

For 4K 27" panel, to go "retina" you have to set it to scale to 1080p effective resolution, but all elements will appear big. Full 4K effective resolution will be too small to see clearly. So for me, the only effective scaled resolution is 1440p for a 4K panel when connected to recent Macs.
 

dada_dave

Elite Member
Posts
2,300
Reaction score
2,324
I'm running my M1 Mini connected to an LG 32" 4K panel, scaled to 1440p effective resolution. Works fine for me.

For 4K 27" panel, to go "retina" you have to set it to scale to 1080p effective resolution, but all elements will appear big. Full 4K effective resolution will be too small to see clearly. So for me, the only effective scaled resolution is 1440p for a 4K panel when connected to recent Macs.
Good to know that it looks good to you! I've seen people complain. I'm interested in trying it myself but definitely curious about others' opinions.
 
Last edited:

Andropov

Site Champ
Posts
654
Reaction score
850
Location
Spain
On top of that, standard techniques which I guess Apple refuses to use can make text look great at 4K. Sure, it would look even better at 5K, but you're fighting against "good enough" for a display. I know 8K TV screens are a thing now so maybe higher resolution displays will eventually make their way to the monitor world, but unlike for movies and mobile it just isn't seen as much of a necessity. 4K seems to be where the current sweet spot settled, whether by accident of history or technical reasons.
My point was that Apple did support subpixel antialiasing for font rendering for a looooong time. It was pretty much required for old displays. They stopped doing it in macOS Mojave (2018). Why? A couple of reasons come to mind:
- Apple is evil (the favorite option at the other place).
- The feature has been around for so long that the code implementing it was old, refactoring that code for what is essentially a "legacy" feature couldn't be justified, and engineers wanted to get rid of it.
- Getting rid of it helped achieve some other goal, like greatly simplifying existing code paths and speeding up development of other related features.

A quick Google search seems to indicate that the winner is option 3. From Hacker News:
ex-MacOS SWE here. Subpixel antialiasing is obnoxious to implement. It requires threading physical pixel geometry up through multiple graphics layers, geometry which is screen-dependent (think multi-monitor). It multiplies your glyph caches: glyph * subpixel offset. It requires knowing your foreground and background colors at render time, which is an unnatural requirement when you want to do GPU-accelerated compositing. There's tons of ways to fall off of the subpixel antialiased quality path, and there's weird graphical artifacts when switching from static to animated text, or the other way. What a pain!
Nevertheless there's no denying that subpixel-AA text looks better on 1x displays. Everyone notices when it's not working, and macOS will look worse without it (on 1x displays).
Now I have no proof that this person is an ex-Apple engineer, but what they say makes sense. So it's reasonable to get rid of it. Maybe not as soon as Mojave (2018), when the last non-Retina Mac had been released in 2017 (MacBook Air), but definitely by that last Mac's end of life (2023-2024).
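The glyph-cache point in particular is easy to picture: with grayscale AA a rasterized glyph can be reused anywhere, but with subpixel AA the cached bitmap also depends on the fractional pixel offset and on the panel's subpixel layout, so the number of cached rasterizations multiplies. A toy illustration of that blow-up - emphatically not how CoreText actually structures its cache:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GrayscaleKey:
    # One cached bitmap per glyph/font/size: reusable on any display.
    font: str
    size_pt: float
    glyph_id: int

@dataclass(frozen=True)
class SubpixelKey:
    # Subpixel AA also bakes in the fractional x-offset (say, quantized to
    # quarter pixels) and the panel's subpixel order, which is screen-dependent.
    font: str
    size_pt: float
    glyph_id: int
    subpixel_offset_quarters: int  # 0..3
    subpixel_order: str            # "RGB" or "BGR"

# Rough cache blow-up for one font at one size:
glyphs = 200
print("grayscale cache entries:", glyphs)          # 200
print("subpixel cache entries: ", glyphs * 4 * 2)  # 1600
```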
 

dada_dave

Elite Member
Posts
2,300
Reaction score
2,324
My point was that Apple did support subpixel antialiasing for font rendering for a looooong time. It was pretty much required for old displays. They stopped doing it in macOS Mojave (2018). Why? A couple of reasons come to mind:
- Apple is evil (the favorite option at the other place).
- The feature has been around for so long that the code implementing it was old, refactoring that code for what is essentially a "legacy" feature couldn't be justified, and engineers wanted to get rid of it.
- Getting rid of it helped achieve some other goal, like greatly simplifying existing code paths and speeding up development of other related features.

A quick Google search seems to indicate that the winner is option 3. From Hacker News:

Now I have no proof that this person is an ex-Apple engineer, but what they say makes sense. So it's reasonable to get rid of it. Maybe not as soon as Mojave (2018), when the last non-Retina Mac had been released in 2017 (MacBook Air), but definitely by that last Mac's end of life (2023-2024).
Apple has always been rather quick to rid itself of no-longer-"needed" vestigial components, be they software or hardware. Especially if they're troublesome, which it sounds like this was. And I get it: all their new machines shipped with "retina" displays, and at that time the only headless Macs were the cheap minis and the niche Pro towers.

But if they thought the industry was going to follow suit as it did in mobile, well, it didn't. More's the pity, probably: I can't imagine producing 5K displays at scale would be that much more expensive than 4K, certainly not relative to what we see today. But for the reasons mentioned in my previous post I can equally understand why the industry didn't follow Apple into a high-DPI race like what happened in the mobile market. Apple couldn't really put pressure on monitor makers like it could in mobile or even laptops. So the industry got to 4K and largely stopped (color is a different matter, interestingly). And we are where we are.
 

mr_roboto

Site Champ
Posts
308
Reaction score
513
Or maybe that weird-ass looking Intel Power Gadget clone isn't reporting accurately.
I would say this tool is probably mostly okay. He's getting the raw data by running powermetrics and parsing its output. Early on I noticed a discrepancy between some of the graphs and what an instance of powermetrics was saying in a terminal window, reported it to him, and he fixed the problem. (Something to do with how he was averaging iirc, I forget the exact details.)

Should we completely trust powermetrics? Personally I don't, since I've seen too many odd things in its output, but I do think it's a reasonable approximation of the truth for many purposes.
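For anyone who wants to sanity-check a tool like that against the source data, something along these lines is roughly the idea - treat the flag set and the "CPU Power" label as assumptions to verify on your own machine, since powermetrics output varies a bit across macOS releases and chips (and it needs sudo):

```python
import re
import subprocess

def sample_cpu_power(interval_ms: int = 1000, samples: int = 5) -> list[float]:
    """Run powermetrics (requires sudo) and pull out the CPU power readings in mW.

    The label text and units below are what recent Apple Silicon machines print
    for the cpu_power sampler; adjust the regex if your output looks different.
    """
    cmd = [
        "sudo", "powermetrics",
        "--samplers", "cpu_power",
        "-i", str(interval_ms),
        "-n", str(samples),
    ]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

    readings = []
    for line in out.splitlines():
        m = re.match(r"\s*CPU Power:\s*([\d.]+)\s*mW", line)
        if m:
            readings.append(float(m.group(1)))
    return readings

if __name__ == "__main__":
    for mw in sample_cpu_power():
        print(f"CPU power: {mw / 1000:.2f} W")
```

Comparing raw samples like those against whatever the GUI tool graphs is the kind of cross-check I mean.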
 

Andropov

Site Champ
Posts
654
Reaction score
850
Location
Spain
But if they thought the industry was going to follow suit as it did in mobile, well, it didn't. More's the pity, probably: I can't imagine producing 5K displays at scale would be that much more expensive than 4K, certainly not relative to what we see today. But for the reasons mentioned in my previous post I can equally understand why the industry didn't follow Apple into a high-DPI race like what happened in the mobile market. Apple couldn't really put pressure on monitor makers like it could in mobile or even laptops. So the industry got to 4K and largely stopped (color is a different matter, interestingly). And we are where we are.
Yeah, it's a shame we got stuck on 4K instead of 5K. Seems like the high-end monitor market ended up in a race for higher refresh rates instead of higher DPI, and we got stuck with 4K displays. Around 2015 there were several 27" 5K monitors released, but public interest must have been mild at best because most of them were dropped and replaced by 4K versions within the next couple of years. It's only recently that we've seen more 5K monitors available.

I'm happy with companies supporting old stuff when it doesn't restrict further development (I believe you can still connect a 1st gen iPod to macOS Sonoma and manage the songs on it), but some things are just roadblocks to improving the user experience for virtually all users. IMHO it's worth it to annoy a few users by dropping dead weight if it's going to improve the experience for the vast majority.
 

Aaronage

Power User
Posts
144
Reaction score
213
I would say this tool is probably mostly okay. He's getting the raw data by running powermetrics and parsing its output. Early on I noticed a discrepancy between some of the graphs and what an instance of powermetrics was saying in a terminal window, reported it to him, and he fixed the problem. (Something to do with how he was averaging iirc, I forget the exact details.)

Should we completely trust powermetrics? Personally I don't, since I've seen too many odd things in its output, but I do think it's a reasonable approximation of the truth for many purposes.
Yeah I have no reason to doubt it really. Just found it funny that someone decided to copy the interface from Intel Power Gadget 🙂
 

throAU

Site Champ
Posts
275
Reaction score
293
Location
Perth, Western Australia
The problem is, according to Apple, that isn't "retina" at that size - it has to be 5K at 27". And apparently text at 4K looks fuzzy coming out of macOS. I don't have one myself, but I've seen this reported multiple times, including in this thread. Some people aren't bothered by it, others are.

Btw I agree with you that 4K should be more than fine and Apple should treat 27" 4K in macOS as a first-class viewing standard, but apparently they don't.
I've run my Macs on a 27" 4K display and it looks fine to me.
 

Souko

Member
Posts
16
Reaction score
36
I don't want to honor it with a view lol
Is the complaint just the usual "it gets to 108°C before the fans ramp up", or is there more to it?
Yes, it hits 108°C before the fans ramp up. After that it sits at 90-95°C.
The video is unsurprisingly really bad. He tests Cinebench scores while running Mx Power Gadget and some fan control app in the background. I didn't watch the whole video, only some parts. It was the usual lies and misleading info.
 
Last edited:

Altaic

Power User
Posts
146
Reaction score
182
The feature has been around for so long that the code implementing it was old, refactoring that code for what is essentially a "legacy" feature couldn't be justified, and engineers wanted to get rid of it.
I suspect this is the reason. The font rendering API(s!) were a mishmash for a long time, if only because it was such a high priority in the early Apple days. More recently, they just yeeted it and, I suppose, expected people to climb on board the Retina train. As far as I know, they haven't readdressed it since.
 

dada_dave

Elite Member
Posts
2,300
Reaction score
2,324
I would say this tool is probably mostly okay. He's getting the raw data by running powermetrics and parsing its output. Early on I noticed a discrepancy between some of the graphs and what an instance of powermetrics was saying in a terminal window, reported it to him, and he fixed the problem. (Something to do with how he was averaging iirc, I forget the exact details.)

Should we completely trust powermetrics? Personally I don't, since I've seen too many odd things in its output, but I do think it's a reasonable approximation of the truth for many purposes.
Yeah, powermetrics can be a bit wonky sometimes and should be sanity-checked against wall power measurements at least (which have their own issues, obviously), but it's often reasonable.

This is absolute bullshit 🙄

Eh, that's not too far off, IMHO.

I lean more towards @Aaronage's opinion myself. Yes, Apple has very good memory compression, and fast SSDs make swap far more bearable than in the bad old days of spinning rust, but to say 8GB is the equivalent of 16GB on a PC? Hmmm … that's BS marketing rationalization unless backed up by extraordinary evidence.
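If anyone wants to see how hard the compressor and swap are actually working on their own machine, macOS exposes the raw counters via vm_stat. A small sketch for pulling them out - the label names are what current macOS prints, but treat the parsing as an assumption to adjust rather than a polished tool:

```python
import re
import subprocess

def read_vm_stat() -> dict[str, int]:
    """Parse `vm_stat` output into {label: pages}; the page size comes from the header."""
    out = subprocess.run(["vm_stat"], capture_output=True, text=True, check=True).stdout
    stats = {}
    m = re.search(r"page size of (\d+) bytes", out)
    page_size = int(m.group(1)) if m else 4096
    for line in out.splitlines()[1:]:
        label, _, value = line.partition(":")
        value = value.strip().rstrip(".")
        if value.isdigit():
            stats[label.strip()] = int(value)
    stats["page_size"] = page_size
    return stats

if __name__ == "__main__":
    s = read_vm_stat()
    gib = lambda pages: pages * s["page_size"] / 2**30
    # Label names assumed from recent macOS output; adjust if yours differ.
    print(f"Pages stored in compressor:   ~{gib(s.get('Pages stored in compressor', 0)):.2f} GiB uncompressed")
    print(f"Pages occupied by compressor: ~{gib(s.get('Pages occupied by compressor', 0)):.2f} GiB resident")
    print(f"Swapouts (cumulative pages):   {s.get('Swapouts', 0)}")
```

The gap between "stored in" and "occupied by" is roughly the compression win, which is the part of the 8GB-vs-16GB claim you can actually measure yourself.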
 
Last edited:

Jimmyjames

Site Champ
Posts
772
Reaction score
872
Yeah, powermetrics can be a bit wonky sometimes and should be sanity-checked against wall power measurements at least (which have their own issues, obviously), but it's often reasonable.





I lean more towards @Aaronage's opinion myself. Yes, Apple has very good memory compression, and fast SSDs make swap far more bearable than in the bad old days of spinning rust, but to say 8GB is the equivalent of 16GB on a PC? Hmmm … that's BS marketing rationalization unless backed up by extraordinary evidence.
Yes. I'm no expert in Windows memory management; however, this statement is too vague to mean much. Equal in what, for which applications, etc.?

I do think the Apple Silicon Macs are good at dealing with swap generally, but I still think they should provide more RAM in all their Macs.
 