> the numbers given are relative to M2 (not M3)

Well, that would probably be because the iPad Pro had the M2, not the M3.
> the numbers given are relative to M2 (not M3)

I assume this is using the same cores we will see in the A18, which are likely new. I doubt they are wider, but they may have deeper reorder buffers and more flexibility in pipeline assignments and in what is allowed to run in parallel. Add some more memory bandwidth and maybe some Armv9 support into the stew. Probably not a clean-sheet redesign.

I was wondering if it might improve the Apple Pencil experience; people pay a lot of money to add textured covers to their iPads to get a more paper-like writing surface.
I was wondering the same, not clear. The only thing I can go on is that they described the CPU as "brand new" but the GPU as "building off the M3". So I lean towards a brand new CPU design, but that could just be video fluff, or it could be a brand new design that is still 9-wide decode. We won't know until launch.
> I wonder if the M4 is still using LPDDR5 RAM.

It's LPDDR5X, I think.
> I was wondering if it might improve the Apple Pencil experience; people pay a lot of money to add textured covers to their iPads to get a more paper-like writing surface.

That's a good point. Plus, given that it's meant to be touched, it might be a different type of nanotexture from that used in the ASD and XDR.
> I was wondering the same, not clear. The only thing I can go on is that they described the CPU as "brand new" but the GPU as "building off the M3". ...

It would be shocking if they were able to deliver a new microarchitecture within half a year. The CPU is likely the same, maybe with some balancing tweaks to take better advantage of the wider backend (the M3 doesn't seem to get much out of it). Would be interesting to see. I'm most curious about the new "ML accelerators" though, which could mean redesigned AMX units.
> It's LPDDR5X, I think.

Found this from Ryan Smith at Anandtech:
We don't know yet, but if, as suspected, the M4 GPU is a very minor upgrade, then it will be the third generation without a significant improvement in raw compute or raster performance. Certainly the M3 saw a big change in architecture, with ray tracing, Dynamic Caching, and mesh shaders, but we may be on the third iteration of mid-40,000 Geekbench scores.

Anyone concerned about this?
> We don't know yet, but if, as suspected, the M4 GPU is a very minor upgrade ... Anyone concerned about this?

On the contrary, lately the addition of new features has been, if anything, more impressive than having raw performance numbers go up.
> On the contrary, lately the addition of new features has been, if anything, more impressive than having raw performance numbers go up.

Yeah, the massive architectural improvements are far more important than improving performance. It's easier to add more GPU cores and speed up their clocks than it is to completely change their architecture. And the changes they have made to reduce the system-power impact of using the GPU will let them beef up the clocks and parallelism without a crazy power budget.

And a few years from now, having all their devices of the last N years with extensive feature support will do more to lure game studios in than the raw power of the phone (within reason). It's not economically feasible to build for the latest phones only, not even for the latest iPhones. And having to support old GPUs with missing features is painful.
> Ugh ... I'm exhausted, so of course I stupidly responded to trolls on Anandtech and got stuck arguing that no, the M3 Max CPU really doesn't draw 100 watts by itself.
>
> Apple Announces M4 SoC: Latest and Greatest Starts on 2024 iPad Pro (www.anandtech.com)
>
> Basically it's a wall-power measurement on Cinebench R15, which yes gets to 93 W (external screen). I'm trying to patiently explain that wall-power measurements can vastly inflate power draw compared to software measurements like powermetrics (Mac) and HWiNFO (PC), so comparing a wall-power measurement (93 W) against the 7950X's eco-mode rating of 65 W (not even any kind of measurement) is nonsense. And measured at the wall in eco mode, the 7950X is, guess what, higher than the highest power measurement I've ever seen for the M3 Max! Never mind that comparing the 7950X to the M3 Max even at comparable power is nonsense, because one is a bigger chip with vastly more threads that doesn't go into laptops, not even in eco mode, and the M3 Max does pretty damn well even so. You know how a Threadripper Pro at 105 W would beat the pants off a 7950X at 105 W?
>
> Sorry, rant over.

Oh man, been there. Not on Anandtech, but I think I posted about my surprise at how bad they were over there. It's such a tough line for me: engage and get stressed, or ignore and suffer the nagging feeling that I have to wade in!
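For anyone who wants to sanity-check the software-side numbers in that rant, here is a minimal sketch (not from any post above) of sampling CPU package power on Apple Silicon with macOS's built-in `powermetrics` tool. The exact "CPU Power: ... mW" output line is an assumption based on recent macOS releases and may differ on yours.

```python
# Sketch: sample Apple Silicon CPU package power via `powermetrics`.
# Requires sudo; the output format ("CPU Power: NNNN mW") is an
# assumption and may vary across macOS releases.
import re
import subprocess

def sample_cpu_power_watts(samples: int = 5, interval_ms: int = 1000) -> list[float]:
    """Collect `samples` CPU power readings, one every `interval_ms` ms."""
    out = subprocess.run(
        ["sudo", "powermetrics", "--samplers", "cpu_power",
         "-i", str(interval_ms), "-n", str(samples)],
        capture_output=True, text=True, check=True,
    ).stdout
    # powermetrics reports milliwatts; convert each reading to watts.
    return [int(mw) / 1000.0 for mw in re.findall(r"CPU Power:\s*(\d+)\s*mW", out)]

if __name__ == "__main__":
    print("CPU package power (W):", sample_cpu_power_watts())
```

Note that this reads the SoC's own power counters, so it will always come in below a wall meter, which also captures the display, VRM and power-supply losses, and everything else in the box; that gap is exactly what the argument above is about.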
> Ugh ... I'm exhausted, so of course I stupidly responded to trolls on Anandtech and got stuck arguing that no, the M3 Max CPU really doesn't draw 100 watts by itself.

Those guys had a thread about me over there in 2011. I felt so special.
> Oh man, been there. Not on Anandtech, but I think I posted about my surprise at how bad they were over there. It's such a tough line for me: engage and get stressed, or ignore and suffer the nagging feeling that I have to wade in!
>
> Edit: Damn, just read some of it. That Terry person, the irresistible combination of condescension and ignorance. Hard to resist. Just ignore it and get some sleep. You can't help them.
>
> Edit 2: Now I see Maynard is joining in. Ahhhhhhhhhh

Thanks. To be fair, the pro-Apple troll lemurbutton was being an ass: he wasn't completely wrong, but his comment was needless flamebait, and that's not the first time he's done that. He consistently and deliberately provokes other trolls. Most of the time I ignore it all relatively easily, but this time I was so tired that his antagonist trolls provoked me by spouting even worse idiocy. Were I a moderator over there, I'd nuke the whole thing. Possibly literally, just to make sure.
> Thanks. To be fair, the pro-Apple troll lemurbutton was being an ass ...

I didn't read the entire thread, just a few posts from your link, and I had to stop after I felt my blood pressure rise dangerously!
> Oh man, been there. Not on Anandtech, but I think I posted about my surprise at how bad they were over there. ...

I jumped in early too. Mainly just the one guy. But he's stubborn.