M3 core counts and performance

I find it kind of difficult to imagine that Apple will be able to maintain a strict SoC design across the board. The base model will probably continue to be monolithic, but above the base Mx they will go modular with interconnects, allowing selective core-pack, GPU, and support modules for the various performance levels. To the end user, the product will look the same, just more flexible (at point of sale).
 
Considering how few "SKUs" of chip Apple cares about, does this actually make sense? With the M2, they have 3 dies that cover pretty much everything they want, with some binning to stratify it a bit further. About the only oddity in this lineup is the Mac Pro, and that's probably where they have reason to stretch their wings and do something different. Although if the Ultra becomes a 4th die in the lineup that can be interposed for the Mac Pro, maybe that's all they think they need in the short term?

Even when I was thinking Apple might go chiplets back at the initial announcement, I was thinking more in terms of the Mac Pro to get more die space so they could go ham, and get all the CPU/GPU and I/O they wanted for such a beast.
 
I agree that in the short term we likely won’t see a “Lego” design SoC, but I do think it makes sense, as it could allow Apple to expand its offerings. For instance, while Apple can offer compelling products at every tier it currently occupies (looks sideways at the Mac Pro), it cannot really compete in, say, midrange gaming or, alternatively, midrange CPU-focused computing. Basically, if a user wants powerful graphics or powerful CPU performance, they have to buy a machine with both, regardless of whether they actually want both.

Apple really does seem interested in competing in the gaming market, and while having a decent base machine (which they do) is arguably the most important aspect, Apple doesn’t have the flexibility to offer a midrange gaming experience for a midrange price, especially on the desktop. To pull that off, they would need the ability to combine a lower-end CPU with a higher-end GPU - e.g., imagine a base M3 CPU with a cut-down M3 Max GPU and an M3 Pro’s base RAM. Don’t get me wrong, Apple’s designs are great for balanced usage and workstations, but there are significant product gaps that they not only don’t fill but actually can’t fill.

So I think if the technology allows that sort of “Lego design” in the future, it’s definitely something they should consider. We’re not there yet, and if the leaks from a while ago are true, we won’t be for some time, but I think it’s an interesting idea that would allow Apple to offer more SKUs while not proportionally increasing their workload or manufacturing costs (well, it would increase them, just not as much as designing and fabbing all those unique SoCs).
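To put a toy number on that: with monolithic designs, every CPU/GPU pairing Apple wants to sell is its own tape-out, while a chiplet approach covers the same matrix with far fewer unique dies. A quick sketch (the three tiers are purely illustrative, not Apple's actual lineup):

```python
from itertools import product

# Hypothetical performance tiers, purely for illustration.
cpu_tiles = ["base", "pro", "max"]
gpu_tiles = ["base", "pro", "max"]

# Monolithic: every CPU/GPU pairing offered must be its own die design.
monolithic_dies = len(list(product(cpu_tiles, gpu_tiles)))

# Chiplet "Lego": design one tile per tier, then mix them at packaging time.
chiplet_dies = len(cpu_tiles) + len(gpu_tiles)

print(f"SKUs offered:           {len(cpu_tiles) * len(gpu_tiles)}")
print(f"Monolithic die designs: {monolithic_dies}")
print(f"Chiplet tile designs:   {chiplet_dies}")
```

The gap widens quickly: add a fourth tier to each list and you need 16 monolithic designs but only 8 tiles.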
 
Watched this LTT video on the new Intel Ultra chips. Hard to watch for me because I don't like his style of presenting, and the other stuff!


It’s mostly a bloodbath in favour of the M3, but it’s weird: the video starts by saying it’s a discussion of chips, but then he excludes a 16GB model of the M3 due to the price of the laptop. Make up your mind!
One nice thing I did see in the video: I had no idea performance for AV1 CPU encoding within Handbrake had improved so much on Apple Silicon; the last time I looked, it was way behind Intel and AMD.
 
Watched this LTT video on the new Intel Ultra chips. Hard to watch for me because I don't like his style of presenting, and the other stuff!

This is closer to the Lego SoC design I hope Apple adopts in the future, but I also hope they keep the naming simpler … thankfully Apple doesn’t need so many product tiers.

It’s mostly a bloodbath in favour of the M3, but it’s weird: the video starts by saying it’s a discussion of chips, but then he excludes a 16GB model of the M3 due to the price of the laptop. Make up your mind!

Aye, one issue of course is that the M3 13/15” Air isn’t out yet. Another is that power levels weren’t reported, except to compare the NPU to the GPU. Though I think few here would disagree that 8GB of base RAM is getting long in the tooth, especially for the MacBook Pro.

One nice thing I did see in the video: I had no idea performance for AV1 CPU encoding within Handbrake had improved so much on Apple Silicon; the last time I looked, it was way behind Intel and AMD.

Indeed.
 
Wait, that is the 4+4+10 base model M3 stomping all over a midrange Ultra 7?
Yes, and it's the H variant, not the U variant, of the Core Ultra 7. I was … more than a little surprised.

To be fair to Intel, they’ve priced it appropriately, and most of the tests weren’t strictly multicore, but … it’s still not great that a 16-core, 22-thread CPU that can theoretically go to 115W (65W TDP) (though the Zenbook, as a thin-and-light, might not be able to, or at least not sustainably) only beats the M3 in CB24 multicore by 10% and loses badly in CB24 single core.

And of course there is the Ultra 7 165H, which is very slightly faster, but after that it’s only the Ultra 9 whose CPU isn’t really much better, again just slightly higher clocks. All the top laptop CPUs are 6+8+2.

 
Interesting, but as you say it's a bit hard to watch (how many sponsors and plugs can they throw into one video?).

I concur with Linus that Pro Apple machines should start at 16GB RAM. But on the other hand, it is interesting that the M3 with 8GB managed to be competitive against the Ultra H model.
It would have been interesting to have the power usage in combination with the benchmarks.
 

Well, that's just it - 115W is a "max turbo" thing, not sustained. Even 65W sustained is asking a lot.

I find it instructive to look at a different Intel spec, base power. Intel's definition of base power: "The time-averaged power dissipation that the processor is validated to not exceed during manufacturing while executing an Intel-specified high complexity workload at Base Frequency and at the junction temperature as specified in the Datasheet for the SKU segment and configuration." Base power for this Ultra 7 chip is 28W, and base frequency is 1.4 GHz for the performance cores, 900 MHz for the efficient cores. That's a big haircut from the max turbo frequencies (4.8 GHz / 3.8 GHz) advertised alongside that 115W limit!

In practice I expect these chips will usually manage a bit better than 1.4 / 0.9, because the spec has to be conservative. "Intel-specified high complexity workload" translates to "as nasty a power virus as we could craft", and most applications won't behave like that worst case.

Still, the base power being 28W is significant. Before Apple Silicon Macs, 13" MBPs used 28W TDP CPUs, and that was when Intel TDP actually meant something. 28W is a good target for laptops in roughly the class of the 13" MBP, because even if you try to make something higher work thermally, you'll end up with uselessly short battery life at maximum load.

So it's not too shocking to me that this chip disappoints. Its paper spec is technically true but no serious system integrator is actually going to let it run at 65W (much less 115W) in anything but short bursts, and you'll end up with performance somewhere closer to the "base" specs.
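To make the burst-vs-sustained point concrete, here's a simplified model of Intel-style power limiting: the package may draw the short-term limit (PL2) until a rolling average of package power reaches the sustained limit (PL1), after which it's clamped. This is a sketch with illustrative constants, not Intel's actual firmware algorithm:

```python
# Toy model of Intel-style power limits: burst to PL2 (turbo) until an
# exponentially weighted moving average of package power reaches PL1
# (sustained "base power"), then clamp to PL1. Constants are illustrative.
PL1 = 28.0   # sustained power limit, watts
PL2 = 115.0  # short-term turbo power limit, watts
TAU = 28.0   # averaging time constant, seconds
DT = 0.1     # simulation timestep, seconds

def simulate(duration_s):
    avg = 0.0    # moving average of package power
    trace = []
    alpha = DT / TAU
    for _ in range(int(duration_s / DT)):
        power = PL2 if avg < PL1 else PL1  # turbo only while budget remains
        avg += alpha * (power - avg)
        trace.append(power)
    return trace

trace = simulate(60.0)
burst_s = sum(DT for p in trace if p > PL1)
print(f"turbo (115W) lasts ~{burst_s:.1f}s, then settles at {trace[-1]:.0f}W")
```

With these numbers the 115W turbo only survives a handful of seconds; everything after that runs in the 28W class, which is why sustained tests like CB24 multicore look so much worse than the paper spec.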
 
on the other hand, it is interesting that the M3 with 8GB managed to be competitive against the Ultra H model.


It’s interesting but not new that it is competitive with 16 GB Intel machines. I talked about it before.

A direct quote from one of their videos, titled “Apple M1 Mac’s 8GB vs 16 gb ram - multitasking stress”:

“My final thoughts are: the 8 GB model performs like a 16 GB Intel MacBook Pro. And the 16 GB model performs more like a 32GB Intel MacBook Pro, or maybe even better than that, honestly. It performed extremely well; under all of that it was still doing things really fast.”
 
Sorry, I understood that; I should’ve used a different word than “sustainable” there. I meant more like “reliable” - in other words, it might hit that burst once, but subsequent bursts won’t, because the laptop will be too hot. And I agree that even 65W sustained is likely an issue for a laptop of its size, which will be impacted by the long CB24 test. For the record, its GB6 multicore result, which is a burstier test, was worse, only matching the M3. Though we know that GB6 “penalizes” many-core systems by emphasizing cooperative workloads where threads have to communicate, that’s still not great. It’s a pity SPEC results will be hard to find.

I think another factor in why it disappoints is that, as LTT mentioned, Intel actually lowered max clocks for the first time. Given how reliant they are on clocks, that means some of their single-core results are actually worse than even their own older generation, and much worse than Apple’s current one.
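The point about GB6 "penalizing" many-core chips can be illustrated with a crude contention model (in the spirit of Amdahl's law / the Universal Scalability Law): ideal throughput grows linearly with thread count, while a cooperative workload pays a per-thread communication cost. The 5% contention term is purely illustrative:

```python
def speedup(threads, contention=0.0):
    """Crude scaling model: linear speedup degraded by a contention
    penalty that grows with the number of communicating threads."""
    return threads / (1.0 + contention * (threads - 1))

# Compare an embarrassingly parallel workload with a cooperative one.
for n in (4, 8, 16, 22):
    print(f"{n:2d} threads: ideal {speedup(n):5.2f}x, "
          f"cooperative {speedup(n, contention=0.05):5.2f}x")
```

Under this toy model, 22 threads with even a 5% contention term deliver only ~10.7x, so a high-thread-count chip gets far less benchmark credit than its core count would suggest.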

It’s interesting but not new that it is competitive with 16 GB Intel machines. I talked about it before.

So, good news and bad news here. Bad news first: many of the tests aren’t particularly memory intensive, and when one does get memory intensive, you start to see problems creep in (my guess is the 1% lows in Pharaoh) or things outright stop working (Pugetbench).

The good news: the 8GB MBP was actually competing against a 32GB Intel machine, never mind a 16GB one. ;) But the fact that the competition offers 32GB at that price point is one of the reasons why Apple needs to up their base model’s RAM. The competition has moved on; Apple needs to as well. I say this as someone who bought that very model, but that was a definite sticking point.
 
It's long past due for Apple to get more competitive on this front. I know they want to preserve the huge profit margin they make off underspeccing base configs so people end up spending $200 on the bump, but it's gotten so far out of touch with where the rest of the market is.
 
I remember when the M1 first came out, pricing it against its competition and feeling the base models were quite competitive in performance and even in specs. Three years later, I have to admit the performance, especially in laptops, is still great, but the RAM and storage tiers definitely need to be reworked. Apple has a tendency to hold onto storage and RAM specs longer than they should.

I have no issue with the Max starting at 36 GB of RAM, mind you, or even 1 TB of storage, but $400 for 2TB is a terrible upgrade price. And again, I say this as someone who has just upgraded to that very model with that config for myself; that was a little painful. (My wife and I both upgraded our computers.) So for some tiers the base model is fine but some of the upgrades are not, and for some the base model itself needs a rework.

It may be in vain, but I’m hoping Apple at least adds some RAM to the base model M4 next generation; even 12 GB would be something. Sure 16 GB would be better but I’m not so optimistic that Apple will double capacity in a single generation. Or, at the least, just as the M3 MacBook Pro starts at a higher GPU core count, it should also start at a higher RAM capacity. Something.
 
"Sure 16 GB would be better but I’m not so optimistic that Apple will double capacity in a single generation."

I would assume that would depend a lot on competing priorities of the design and how the price and space of RAM evolves?
 
Sure, but my gut feeling, looking at all those variables (that I can) and comparing them with Apple’s history, is that I just don’t think it’s likely that Apple will go from 8 to 16 GB base in the M4 generation. Obviously I’d be happy to be wrong; I’m just not going to set myself up for disappointment when it’s almost as likely that they stay at 8GB for the M4. Going to 12GB, though? That I can see as more likely than either, but again, that’s just a gut feeling.
 
Cliff could probably give us an idea of what it costs to develop a new, competitive SoC, which Apple seems to produce yearly, and which only goes into Apple products. That is their main justification for their boutique prices. It almost seems like they could tape out an SoC for the general market at around 80% of the performance of their own devices (with an unhackable clock limiter) to defray the dev expense while also pantsing Qualcomm et al. More quality ARM out there would be a good thing.
 
Cliff could probably give us an idea of what it costs to develop a new, competitive SoC, which Apple seems to produce yearly, and which only goes into Apple products.

No I can’t :)

My only guess is that Apple’s expenses are probably similar to AMD’s, with similarly-sized design teams and cost structure. But that guess could be wildly wrong.
 
Maynard Handley thinks the apparent lack of UltraFusion could herald the arrival of a new interconnect design that will allow multiple dies:

 
I've got the Apple Studio Display and a 27" LG 4K next to it. I've not really had any issues at all, except that the Mac sometimes, very rarely, doesn't see the second monitor.

I think that, for the most part, the Samsung is decent gear, just at the wrong price to compete with Apple quality. That being said, the wobbly plastic housing is a bit shit.


When I was a kid, I wanted to be big and strong like the professional wrestlers that I watched on television. Now that I often sit at my computer desk for extended periods, I'm quite pleased to be completely average in build. I've never had issues with monitor height adjustment, either.


I've mentioned before how my current Mac setup is held together with sticks, bubble gum, and unicorn tears. The speakers I use are a cheap Logitech pair that I got for $1 at a yard sale. Anything I upgrade to will be better. My next Mac isn't going to be a cheap stopgap, I'm just putting it off for as long as possible.


Flattery will get you everywhere, my friend. However, as a self-admitted small brain person, the main reason I visit TechBoards is the hope that all of the smart folks here will someday make me into a big brain.


I think it depends upon personal tolerance. There are users who are perfectly fine with standard 1080p monitors on macOS. Then there are folks like me who obsess over having a ~220ppi "Retina" display alongside my Mac. Unfortunately, the options are limited and quite expensive. Hence, I've been babying my 21.5-inch UltraFine. Getting it brand new for $342 was ridiculous, in hindsight.

I'm running my M1 Mini connected to a LG 32" 4K panel, scaled to 1440p effective resolution. Works fine for me.

For a 4K 27" panel, to go "Retina" you have to set it to a 1080p effective resolution, but then all elements appear big, while the full native resolution is too small to see clearly. So for me, the only usable scaled resolution on a 4K panel connected to recent Macs is 1440p.

My point was that Apple did support subpixel antialiasing for font rendering for a looooong time. It was pretty much required for old displays. They stopped doing it in macOS Mojave (2018). Why? A couple of reasons come to mind:
- Apple is evil (the favorite option at the other place).
- The feature has been around for so long that the code implementing it was old, refactoring that code for what is essentially a ”legacy” feature couldn’t be justified, and engineers wanted to get rid of it.
- Getting rid of it helped achieve some other goal, like greatly simplifying existing code paths and speeding up development of other related features.

A quick Google search seems to indicate that the winner is option 3. From Hacker News:

Now, I have no proof that this person is an ex-Apple engineer, but what they say makes sense. So it’s reasonable to get rid of it. Maybe not as soon as Mojave (2018), when the last non-Retina Mac (the MacBook Air) was released in 2017, but definitely by that last Mac’s end of life (2023-2024).

I've run my Macs on 4k 27" and looks fine to me.

Hey everyone, I just wanted to say thanks for the discussion of monitors from a while ago; it helped a lot. I actually ended up getting a 4K 27" monitor, which I am really happy with. It has a nice KVM switch so I can also use it with a Linux box, and it seems great at 1440p scaling. The only thing is that the scaling from 4K to 1440p isn't a clean multiple, so I get a warning that using a scaled resolution might affect performance. I'm on an M3 Max, so I think I'll be okay. :)

I think my confusion was that I was conflating some more severe problems (teething issues on the M1, and that macOS on Apple Silicon doesn't handle ultra-wide monitors well, or at least I see a lot of complaints about that) with those who are too used to Retina screens and can't go back.

I'm upgrading from an actual 1440p 27" iMac and a 1080p 27" monitor for my Linux box (which I wasn't using much; it was headless and the monitor only brought out when I really needed it) ... so this looks fantastic to me! And being able to control both computers from a single monitor is just really nice. Huge improvement there.
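For anyone wondering where that warning comes from: with macOS HiDPI scaling, a "looks like 1440p" setting renders a 2x backing store (5120x2880) and then downsamples it to the panel's native 3840x2160, so the GPU pushes more pixels than native and the downsample ratio isn't an integer. A quick sketch of the arithmetic (standard HiDPI behavior; the resolutions are just the common 27" 4K cases):

```python
def hidpi_cost(looks_like, panel):
    """For a macOS 'looks like' resolution, compute the 2x backing store
    that gets rendered and how it maps onto the panel's native pixels."""
    backing = (looks_like[0] * 2, looks_like[1] * 2)  # 2x in each dimension
    scale = panel[0] / backing[0]                     # downsample factor
    pixel_ratio = (backing[0] * backing[1]) / (panel[0] * panel[1])
    return backing, scale, pixel_ratio

panel_4k = (3840, 2160)
for looks in [(1920, 1080), (2560, 1440)]:
    backing, scale, ratio = hidpi_cost(looks, panel_4k)
    print(f"looks-like {looks}: render {backing}, "
          f"downsample x{scale:.2f}, {ratio:.2f}x native pixels rendered")
```

"Looks like 1080p" maps 1:1 to the panel (no warning, but a huge UI), while "looks like 1440p" renders roughly 1.78x the panel's pixels and downsamples by 0.75, hence the performance caveat, which is mostly negligible on something like an M3 Max.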
 
I think the "scaled resolution may impact performance" warning is a hangover from when early Retina Intel Macs had crappy integrated GPUs at the low end.

I have noticed zero performance issues with scaled resolutions in the past 10 years or thereabouts.
 