> Yeah I saw. Just well below my expectations. I just hope they deliver with M5 cause AMD won't stand still.

I guess my expectations were lower.

> I thought all Metal API games were TBDR. You have to do something special for that to be the case?

Well, the GPU is TBDR, so in that sense they all do use it. I should be more accurate and say that few if any AAA games use tile shaders, or otherwise optimise for Apple's architecture.
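For the curious, here's a rough sketch of what a tile-shader pipeline looks like in Metal. Nothing here is from an actual shipping game; the "averageLight" function name is made up for illustration:

```swift
import Metal

// Rough sketch: builds a tile pipeline around a hypothetical tile function
// ("averageLight") assumed to exist in the app's compiled .metal library.
func makeTilePipeline(device: MTLDevice, library: MTLLibrary) throws -> MTLRenderPipelineState {
    let desc = MTLTileRenderPipelineDescriptor()
    desc.tileFunction = library.makeFunction(name: "averageLight")!
    desc.colorAttachments[0].pixelFormat = .bgra8Unorm
    desc.threadgroupSizeMatchesTileSize = true
    // Tile pipelines run in the middle of a render pass, reading attachment
    // data straight out of on-chip tile memory instead of round-tripping DRAM.
    return try device.makeRenderPipelineState(tileDescriptor: desc,
                                              options: [], reflection: nil)
}

// Dispatched from inside an open render pass:
//   encoder.setRenderPipelineState(tilePipeline)
//   encoder.dispatchThreadsPerTile(MTLSize(width: encoder.tileWidth,
//                                          height: encoder.tileHeight,
//                                          depth: 1))
```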
> Yeah I saw. Just well below my expectations. I just hope they deliver with M5 cause AMD won't stand still.

What did you think it would be?
> What did you think it would be?

M4 Max at 1440p/60 fps with RT set to medium. I guess that was expecting too much. The GPU is really something Apple should spend a lot of R&D on. It's important to so much of their future product stack, and they depend on a good GPU core.
> The price also hurts: the higher SKUs, i.e. the higher GPU core counts of the Max/Ultra, don't reflect the small % of performance you gain.

Agree on the Ultra. I think the Max gives a decent uplift generally.
> M4 Max at 1440p/60 fps with RT set to medium. I guess that was expecting too much. The GPU is really something Apple should spend a lot of R&D on. It's important to so much of their future product stack, and they depend on a good GPU core.

I don't think the GPU core is the problem. They simply aren't going to prioritise performance over efficiency. So I get it: for those wanting more performance, especially on the desktop, it's disappointing.
I thought all Metal API games were TBDR. You have to do something special for that to be the case?
> I have to admit I'm confused by this as well. As far as I can tell, yes, the GPU is always trying to leverage the TBDR capability and avoid rendering objects it doesn't have to, but unless the graphics engine is properly tuned, the GPU may not be able to do so effectively. I have little personal knowledge and zero experience, so I'm not sure why, but I've seen multiple people talking about TBDR as though some work is required from developers to make it useful. I know some GPUs, like Qualcomm's, even come with a switch that lets the GPU change rendering modes (including one that sounds like TBDR) so that developers can use the one that suits their engine. One could simply say that this just means there are tradeoffs between rendering modes for different types of scenes (and there are), but taken together it seems like an engine itself can be better or worse suited to TBDR.

So the hardware is always TBDR. When rendering targets the pre-Apple Silicon SDKs of macOS (I believe that would be Catalina and earlier, off the top of my head), the driver does extra work to emulate immediate-mode rendering so that incorrect assumptions about undefined behaviour in the APIs don't cause rendering artefacts. At least, that used to be the case; I'm not sure if it's been removed.
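To make the "developers have to do some work" point concrete, here's a minimal sketch (the sizes and names are my own assumptions) of one TBDR-friendly optimisation an engine has to opt into: declaring an intermediate render target memoryless so it lives entirely in on-chip tile memory and never touches DRAM:

```swift
import Metal

guard let device = MTLCreateSystemDefaultDevice() else { fatalError("no Metal device") }

// Hypothetical G-buffer plane for a deferred renderer at 1440p.
let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba16Float,
                                                    width: 2560, height: 1440,
                                                    mipmapped: false)
desc.usage = .renderTarget
desc.storageMode = .memoryless  // tile-memory only; no system-memory backing

let gBufferAlbedo = device.makeTexture(descriptor: desc)!

let pass = MTLRenderPassDescriptor()
pass.colorAttachments[1].texture = gBufferAlbedo
pass.colorAttachments[1].loadAction = .dontCare   // nothing to fetch from memory
pass.colorAttachments[1].storeAction = .dontCare  // consumed in-pass, never written out
```

An engine written for immediate-mode GPUs typically stores every intermediate target to memory "just in case", which throws this bandwidth saving away.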
> I don't think the GPU core is the problem. They simply aren't going to prioritise performance over efficiency. So I get it: for those wanting more performance, especially on the desktop, it's disappointing.

It's not an efficiency problem. Yes, Apple's GPU cores use high-density libraries, but they are also bloated.
> It's not an efficiency problem. Yes, Apple's GPU cores use high-density libraries, but they are also bloated.

Yeah, I definitely don't agree with this overall take. The Max does perform well on a variety of tasks.
The GPU is a massive part of the M4 Max, and yet the performance doesn't reflect the cost or die area. The Max would need 80 GPU cores, and obviously that's not possible because (a) the cores are too big, and (b) Apple would need to move to tiles and add a massive GPU if it won't make its GPU cores more PPA-friendly.
> You gamers just don't get it, do you? Apple's SoC and GPU strategy is designed to meet the needs of 99% of their user base: that means efficiency and battery life. Nobody gives a fuck about Cyberpunk.

I mean, Apple seems to give a bit of a fuck. Hence the promotion at WWDC, etc. I think there's a sensible middle ground between "games don't matter" and "gaming is all that matters".
> Yeah, I definitely don't agree with this overall take. The Max does perform well on a variety of tasks.

It does really well in Blender, Cinebench, etc. Something about gaming makes it fall apart.
> Apple cares about the PR goodwill. And those same MacBook buyers who value efficiency and battery life may like to play some AAA games once in a while, but they will happily do so only at the low settings Apple's SoCs allow.

We're having a technical discussion here, that's it. The people in this thread (including myself) like Apple products and want them to be the best. There's nothing wrong with comparing them with the alternatives out there, in my opinion.
What drives me nuts is the gamer mindset that Apple has to match Nvidia in gaming specs or else the company is doomed / M-series CPUs suck.
> If I'm reading this correctly, it doesn't look like the Max is too different to the 4070 Laptop in CP2077. Both get ~60 fps at QHD with upscaling. No need for panic! https://www.notebookcheck.net/NVIDIA-GeForce-RTX-4070-Laptop-GPU-vs-M4-Max-40-Core-GPU_11453_12886.247598.0.html

Aye, one issue is that it's not clear to me which quality tier "very high fidelity" is. Is that the same as ultra? The PC side is low/medium/high/ultra, so what's "very high" on the Mac? If it's not the same, how big a quality difference is it?
> It's not an efficiency problem. Yes, Apple's GPU cores use high-density libraries, but they are also bloated. The GPU is a massive part of the M4 Max, and yet the performance doesn't reflect the cost or die area. The Max would need 80 GPU cores, and obviously that's not possible because (a) the cores are too big, and (b) Apple would need to move to tiles and add a massive GPU if it won't make its GPU cores more PPA-friendly.

Apple is the industry leader in GPU perf/watt, and it's not close. Power is one of the two "P"s in PPA, so claiming that Apple's GPU design isn't PPA-friendly is wrong.
> Aye, one issue is that it's not clear to me which quality tier "very high fidelity" is. Is that the same as ultra? The PC side is low/medium/high/ultra, so what's "very high" on the Mac? If it's not the same, how big a quality difference is it?

As far as I can tell it isn't a one-size preset. So on an M1 it could mean medium textures but low shadows and lighting, while on an M4 it could mean medium textures and medium shadows and lighting. What's interesting is that none of the "For This Mac" presets enable RT, whereas on PC I'm pretty sure it auto-enables at least "regular" ray tracing if it detects an Nvidia RTX GPU (I'm not sure whether it auto-selects the RT preset on AMD cards). A rough sketch of how a port might make that call is below.
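This is a hedged illustration, not CDPR's actual logic: Metal exposes capability queries a port could use to decide whether to default an RT preset on. supportsRaytracing reports availability of the Metal ray-tracing API, and the .apple9 GPU family (A17/M3-generation and later) is one way to require hardware-accelerated RT:

```swift
import Metal

// Hypothetical preset gating; the policy here is an assumption for illustration.
if let device = MTLCreateSystemDefaultDevice() {
    let hasRTAPI = device.supportsRaytracing           // Metal RT API available at all
    let hasHardwareRT = device.supportsFamily(.apple9) // hardware-accelerated RT (M3/M4 class)
    let enableRTByDefault = hasRTAPI && hasHardwareRT
    print("RT preset on by default: \(enableRTByDefault)")
}
```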
Hopefully reviewers will be able to manually set the Mac and PC to be identical and we’ll get a much better idea of what the performance actually is. Also curious about performance relative to Wine.
Cyberpunk 2077 at the Ultra preset, fps with SSR at "Psycho" vs. Ultra:

| Setting | SSR = "Psycho" (fps) | SSR = Ultra (fps) | Change |
| --- | --- | --- | --- |
| 4K Ultra | 9.15 | 14.04 | +53% |
| 1440p Ultra | 21.07 | 30.87 | +47% |
| 1080p Ultra | 37.03 | 49.94 | +35% |
| 720p Ultra | 70.21 | 85.31 | +22% |
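For reference, the last column is just the relative fps gain from stepping SSR down from "Psycho" to Ultra, e.g. at 4K: (14.04 - 9.15) / 9.15 ≈ 0.53, i.e. +53%.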