Sonofabitch
> Maybe hold off buying that display…

The rumors have been all over the place. Unlike the PC side, you need a degree in Kremlinology to divine Apple's intentions. I wonder if this is the rumored 7K display, reported by 9to5Mac, that will supposedly replace the XDR (Ross doesn't believe it is), or the rumored $2,500 monitor that would sit between the Studio Display and the XDR. Regardless, unless you need a new monitor immediately, it might be wise to wait until after WWDC.
Yep. It may be a $2,500 display with ProMotion and mini-LED, which makes for a very interesting product. In my case that would be too big a price jump for those features, but, then, I’m not a pro in an industry where I do video or graphics work.
> I was wondering if this might be a "we left a secret in the M1 Max/Ultra" when they introduce Mac Pro.

It's not very secret by now.
> The Mac Studio Display comes in three different models, differentiated by the stand. The stands are not removable from the display. The first is a no-extra-cost tilt-only stand; the second is a VESA mount model. The tilt-and-height-adjustable stand costs $399 extra.

Older iMacs and Apple displays could change between VESA and built-in stands. Bit of a shame they removed that.
Wonder how many returns Apple is going to get when people figure this all out?
https://www.macrumors.com/2022/03/09/studio-display-stands-not-interchangeable/
Maybe not so much. For the high-end Ultra, that would mean the equivalent of 4K "cores", which is a very long way from the 18K number I was seeing for some Nvidia cards. Then again, the power draw of those cards alone is in the 300W range, which is probably quite a bit higher than the peak draw of the entire Ultra SoC.
An Apple GPU core contains 4× 32-wide ALUs, for 128 ALUs or “shader cores/units” in total. Exactly how these work, and whether each ALU can execute a different instruction stream, is not clear as far as I know. The Ultra will therefore contain 8192 “shader cores/units” and should support close to 200,000 threads in flight.
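A quick sanity check of those figures. The per-core layout comes from the post above; the 24-threads-per-lane occupancy multiplier is my own assumption, picked only because it lands near the ~200K figure, and is not an official Apple number:

```python
# Arithmetic behind the figures above. The 4 x 32-wide ALU layout is
# from the post; the occupancy multiplier is an assumption.
alus_per_core = 4
lanes_per_alu = 32
lanes_per_core = alus_per_core * lanes_per_alu
assert lanes_per_core == 128  # "shader cores/units" per GPU core

ultra_gpu_cores = 64  # two M1 Max dies x 32 GPU cores each
total_lanes = ultra_gpu_cores * lanes_per_core
assert total_lanes == 8192

# If each lane keeps roughly 24 threads resident to hide latency
# (assumed, not an Apple spec), that lands near the ~200K figure:
threads_in_flight = total_lanes * 24
print(threads_in_flight)  # 196608
```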
> I’m not entirely sure where you got the 18K number for Nvidia.

It is possible that I was seeing some leaks/speculation on the 40xx series and conflated that with the 30xx models.
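Part of the confusion is that vendors count "cores" at different granularities. The figures below are public spec-sheet numbers, and the point is only about counting conventions, not performance:

```python
# Nvidia counts each FP32 lane as a "CUDA core"; Apple counts a whole
# 128-lane block as one "GPU core". Public spec-sheet figures:
rtx_3090_sms = 82
cuda_cores = rtx_3090_sms * 128          # Nvidia's counting: lanes
assert cuda_cores == 10496

m1_ultra_gpu_cores = 64
apple_lanes = m1_ultra_gpu_cores * 128   # same granularity as CUDA cores
assert apple_lanes == 8192
```

So the two numbers are closer than "64 vs. 10,496" suggests, once both are counted at the lane level.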
> Maybe there's just no way (yet) to balance the load between the two Neural Engines, or they're working on the API to do it.

M1 Ultra's specs show it as having a 32-core Neural Engine, but since it's composed of two M1 Max dies, we know it really has 64 ANE cores, in four clusters of 16 cores each. If they needed to do any work on load balancing or APIs, it's already done. That means if they were going to enable all the clusters, they could've done so already.
The interconnect at the bottom of the M1 Max wasn't exactly a secret, either (although it was less obvious, as Apple's die shots cut it off entirely).
$2500 might be a low estimate considering the 32" XDR display is $5000 and this one would have a 12 MP webcam and speakers. But maybe the price of the XDR display will be coming down soon...
Good point.
> So did anybody order one? I bit the bullet presuming that the 27" AS iMac is either no more or a long way out. Got the Studio Display and the base Ultra.

At least one other person here did. Looking forward to your reviews!
-d
> mid april unfortunately

Good! Random chess benchmarks are obviously the most important thing.
I’ll be sure to run stockfish benchmarks lol
> The reviews of the Mac Studio are out. I have no idea why Apple claimed that the M1 Ultra was close to the 3090, and apparently neither does any reviewer. Too bad Andrei is no longer at AnandTech.

Over at the other place there's open warfare about this issue. There isn't even agreement about exactly what Apple was trying to portray in its vague performance slides. Notice that Apple is using "relative performance" on the y-axis, instead of simply raw performance. I get the feeling that an engineer was trying to explain a performance-per-watt advantage of the Ultra relative to the 3090 to someone in marketing, and marketing ran with it, resulting in this chart. I'm not certain exactly what Apple is trying to communicate here. The GPUs inside the M1 SoC family are already impressive enough as is, particularly considering that it's a first-generation product. So why bring up the 3090 at all, instead of continuing to compare them to the GPUs inside the previous Intel Macs, a comparison that works heavily in their favor?
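The perf-per-watt reading is easy to illustrate with made-up numbers. These are purely hypothetical figures, not measurements of the M1 Ultra or the RTX 3090; the point is only that a "relative performance vs. power" chart can tell half the story:

```python
# Purely hypothetical numbers to illustrate the chart-reading problem;
# NOT measurements of any real chip.
chip_a = {"peak_perf": 1.0, "watts": 110}   # efficient, lower peak
chip_b = {"peak_perf": 1.6, "watts": 450}   # faster, power-hungry

perf_per_watt_a = chip_a["peak_perf"] / chip_a["watts"]
perf_per_watt_b = chip_b["peak_perf"] / chip_b["watts"]

# Chip A wins on efficiency even though it loses on peak throughput,
# so "comparable performance at a fraction of the power" and
# "as fast as" are very different claims.
assert perf_per_watt_a > perf_per_watt_b
assert chip_b["peak_perf"] > chip_a["peak_perf"]
```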