Mac Studio



Maybe hold off buying that display…
 
The rumors have been all over the place. Unlike the PC side, you need a degree in Kremlinology to divine Apple's intentions. I wonder if this is the rumored 7K display alleged by 9to5Mac that will supposedly replace the XDR (which Ross doesn't believe it is), or the rumored $2,500 monitor that sits between the Studio Display and the XDR. Regardless, unless you need a new monitor immediately, it might be wise to wait until after WWDC.
 
Yep. It may be a $2500 display with ProMotion and miniLED, which makes for a very interesting product. In my case that would be too big a price jump for those features, but, then, I’m not a pro in an industry where I do video or graphics work.
 
I was wondering if this might be a "we left a secret in the M1 Max/Ultra" moment when they introduce the Mac Pro.
It's not very secret by now.

The Studio Display comes in three configurations, differentiated by the mount, which is not removable from the display: a tilt-only stand at no extra cost, a VESA mount adapter (also at no extra cost), or a tilt- and height-adjustable stand for $400 extra.

Wonder how many returns Apple is going to get when people figure this all out?

https://www.macrumors.com/2022/03/09/studio-display-stands-not-interchangeable/
Older iMacs and Apple displays could change between VESA and built-in stands. Bit of a shame they removed that.
 
Maybe not so much. For the high-end Ultra, that would mean the equivalent of 4K "cores", which is a very long way from the 18K number I was seeing for some nVidia cards. Although, the power usage on those cards alone is in the 300W range, which is probably quite a bit higher than the peak draw of an entire Ultra SoC.

I’m not entirely sure where you got the 18K number for Nvidia. The 3090 claims 10496 CUDA cores, and the A6000 isn’t much bigger than that. https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3090/

@leman at tOP said:
An Apple GPU core contains 4x 32-wide ALUs, for 128 ALUs or “shader cores/units” in total. How exactly these work, and whether each ALU can execute a different instruction stream, is not clear as far as I know. The Ultra will therefore contain 8192 “shader cores/units” and should support close to 200,000 threads in flight.

While not getting there the same way, the math matches what AnandTech guessed at the time of the M1 release (since Apple does not share details): https://www.anandtech.com/show/16252/mac-mini-apple-m1-tested/3

So it’s not 4K vs 18K as far as I can see, but ~8K vs ~10K.
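For anyone who wants to sanity-check that arithmetic, here is a minimal sketch in Python. The 4x 32-wide ALU layout is the inference quoted above (Apple doesn't publish per-core details), and the CUDA core count is Nvidia's published figure.

```python
# Shader-unit comparison using the figures quoted above.
# The 4 x 32-wide ALU breakdown is an inference, not an Apple-published spec.

alus_per_apple_gpu_core = 4 * 32            # 128 "shader units" per GPU core
m1_ultra_gpu_cores = 64                     # top-end M1 Ultra configuration
m1_ultra_shader_units = m1_ultra_gpu_cores * alus_per_apple_gpu_core

rtx_3090_cuda_cores = 10496                 # Nvidia's published figure

print(f"M1 Ultra shader units: {m1_ultra_shader_units}")    # 8192
print(f"RTX 3090 CUDA cores:   {rtx_3090_cuda_cores}")      # 10496
print(f"Ratio (3090 / Ultra):  {rtx_3090_cuda_cores / m1_ultra_shader_units:.2f}")  # ~1.28
```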
 
There are a couple of fucknuts on MR who literally spent all day in that display thread, not contributing in any meaningful way, simply "liking/loving" every single post critical of the design, from morning till evening.

Wow. :ROFLMAO:

Don't get me wrong, I think the stand/mount is a less than optimal design in an otherwise stellar-looking product. One of the premium accessory manufacturers, like Twelve South, should build a VESA-compatible stand that matches the design.

Anyway, that's about all I have to say, and it took me a couple of minutes, one time, vs. their __whole__freaking__day__. :ROFLMAO:
 
Maybe there's just no way (yet) to balance the load between the two Neural Engines, or they're working on the API to do it.
The M1 Ultra's specs show it as having a 32-core Neural Engine, but since it's composed of two M1 Max dies, we know it really has 64 ANE cores in four clusters of 16. If they needed to do any work on load balancing or APIs, it's already done. That means if they were going to enable all the clusters, they could've done so already.
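Spelling out the arithmetic behind that claim as a minimal sketch (the 64-core figure comes from die-shot analysis, not Apple's spec page, which lists 32 cores):

```python
# Arithmetic behind the claim above. The "two 16-core ANE clusters per die"
# figure is from die-shot analysis; Apple's spec page lists 32 cores total.

m1_max_dies = 2             # M1 Ultra = two M1 Max dies joined by the interconnect
ane_clusters_per_die = 2    # claimed: two Neural Engine clusters per die
cores_per_cluster = 16

physical_ane_cores = m1_max_dies * ane_clusters_per_die * cores_per_cluster  # 64
advertised_ane_cores = 32   # what Apple lists for the M1 Ultra

print(f"Physical (claimed): {physical_ane_cores}, advertised: {advertised_ane_cores}")
```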
 
It's not very secret by now.


The interconnect at the bottom of the M1 Max wasn't exactly a secret, either (although it was less obvious, as Apple's die shots cut it off entirely).
 
Yep. It may be a $2500 display with ProMotion and miniLED, which makes for a very interesting product. In my case that would be too big a price jump for those features, but, then, I’m not a pro in an industry where I do video or graphics work.

$2500 might be a low estimate considering the 32" XDR display is $5000 and this one would have a 12 MP webcam and speakers. But maybe the price of the XDR display will be coming down soon...
 
I think the price will come down. By the time you add stand and nanocoating you're at $3500 anyway.
 
The M1 Ultra's specs show it as having a 32-core Neural Engine, but since it's composed of two M1 Max dies, we know it really has 64 ANE cores in four clusters of 16. If they needed to do any work on load balancing or APIs, it's already done. That means if they were going to enable all the clusters, they could've done so already.
Good point.
 
So did anybody order one? I bit the bullet, presuming that the 27" AS iMac is either no more or a long way out. I got the Studio Display and the base Ultra.

-d
 
At least one other person here did. Looking forward to your reviews!
 
The reviews of the Mac Studio are out. I have no idea why Apple claimed that the M1 Ultra was close to the 3090, and apparently neither does any reviewer. Too bad Andrei is no longer at AnandTech.
 
Over at the other place there's open warfare about this issue. There isn't even agreement about exactly what Apple was trying to portray in its vague performance slides. Notice that Apple is using "relative performance" on the y-axis, instead of simply raw performance. I get the feeling that an engineer was trying to explain a performance-per-watt advantage of the Ultra relative to the 3090 to someone in marketing, and then marketing ran with it, resulting in this thing. I'm not certain exactly what Apple is trying to communicate here. The GPUs inside the M1 SoC family are already impressive enough as is, particularly considering that this is a first-generation product. So why bring up the 3090 at all, instead of continuing to compare them to the GPUs in the previous Intel Macs, which seems to work heavily in their favor?
[Apple's chart plotting "relative performance" against power for the M1 Ultra GPU vs. the RTX 3090 (ultraperf.png)]


On top of that, as you said, all of the competent reviewers who do in-depth analysis are no longer in tech journalism. That leaves us with short synthetic benchmarks like Geekbench, which have little real-world utility, including compute results using the now-deprecated OpenCL; GFXBench off-screen tests, which have similar issues; Cinebench, in which Apple Silicon doesn't appear to be utilized to its full potential; and, of course, gaming benchmarks, which are considerably hindered because they are running under Rosetta 2. These benchmarks may be useful for a quick comparison between chips in the same family, but trying to divine anything meaningful from a comparison with a different platform is questionable, at best. Heck, at this point a handheld stopwatch and the latest beta of Baldur's Gate 3 would tell us more about the M1 Ultra's graphics performance than anything presented thus far, because that's at least a real-world game that is ARM-native and optimized for Apple Silicon.

Unfortunately, the situation isn't going to get any better without someone stepping up like Andrei or Ian, who both left AnandTech recently. I'm sure most of us remember how the tech press and PC partisans lambasted the A15 for being a warmed-over A14, until Andrei proved otherwise, demonstrating that the A15 was in fact a substantial improvement over the previous generation. I'm going to defer to @Cmaier and his belief that Apple has actually been underselling the performance of Apple Silicon. However, when it comes to marketing the Ultra, Apple hasn't done itself any favors with these vague graphs. Maybe they're useful during a snazzy presentation for Tim Cook and his lieutenants to show off to the general public, but they have little utility for tech-literate nerds who are attempting to judge actual performance, rather than "relative performance". At this point we'd have as much success consulting a witch doctor practicing haruspicy on chicken entrails.
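As a purely illustrative aside, here's the kind of arithmetic that can make a "relative performance vs. power" chart look so different from a raw-throughput comparison. The numbers below are invented for the example and are not measurements of either chip.

```python
# Toy example only: invented numbers, NOT measurements of the M1 Ultra or RTX 3090.
# It shows how "performance at equal power" and "peak performance" can tell
# two different stories about the same pair of GPUs.

gpus = {
    "Hypothetical efficient GPU": {"peak_perf": 1.00, "peak_watts": 110},
    "Hypothetical big GPU":       {"peak_perf": 1.30, "peak_watts": 350},
}

for name, g in gpus.items():
    perf_per_watt = g["peak_perf"] / g["peak_watts"]
    print(f"{name}: peak perf {g['peak_perf']:.2f}, "
          f"perf per 100 W {perf_per_watt * 100:.2f}")

# The big GPU wins on peak throughput, but the efficient GPU wins on perf/W.
# A chart that stops the comparison at ~110 W (before the big GPU reaches its
# peak) can show the two as roughly equal, even though the big GPU pulls ahead
# once it's allowed to draw its full power budget.
```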
 