# Mac Studio



## Cmaier

Exclusive: ‘Mac Studio’ is coming – is it the pro Mac mini or mini Mac Pro?

9to5Mac has learned that in addition to a new Mac mini and Mac Pro, Apple has been developing a brand new "Mac Studio" computer.


----------



## Runs For Fun

I wonder if this is that rumored Mac Pro Mini.


----------



## Pumbaa

Should I get one to replace my Mini, I’ll probably name it “Big Mac”.


----------



## Colstan

It sounds like the fabled xMac that Apple customers have wanted for the past twenty years. It's going to look great right alongside the PowerBook G5.


----------



## B01L

Dude posted about the Mac Studio just over 24 hours before 9to5Mac dropped their article...

> Next workstation-class mac will be called Mac Studio
>
> I got some info about next workstation class mac, list below.  - It will be called Mac Studio - Powered by M1 max 2x - 20 CPU cores and 48 GPU cores - Low-end mac display (but the prices not seem to be low) will be call Apple 5K display - As i have seen only front of display, it closely resemble...

(forums.macrumors.com)




My thoughts on the matter...

Peek performance...

A sneak peek at the new Mac Studio headless desktop system...!

Mac Studio

Mac Studio will replace the high-end 2018 Intel Mac mini, bridging the gap between the (new smaller design) Mn-series Mac mini and the full tower Mac Pro...

Mac Studio will go from a base single M1 Pro SoC model all the way up to a dual M1 Max SoC model:

Base model - 8/14, 16/512, Gigabit Ethernet, $1499
Fully loaded model - 20/64, 128/8T, 10Gb Ethernet, $6999
Four TB4/USB4 (USB-C) ports & two USB 3.1 Gen2 (USB-A) ports on the single SoC models...

Six TB4/USB4 (USB-C) ports & four USB 3.1 Gen2 (USB-A) ports on the dual SoC models...

Gigabit Ethernet standard, upgrade to 10Gb Ethernet is US$100...


| SoC | CPU | P/E | GPU | RAM | SSD | Ethernet |
| --- | --- | --- | --- | --- | --- | --- |
| M1 Pro | 8-core | 6P/2E | 14-core | 16GB/32GB | 512GB/1TB/2TB/4TB/8TB | Gigabit/10Gb |
| M1 Pro | 10-core | 8P/2E | 14-core | 16GB/32GB | 512GB/1TB/2TB/4TB/8TB | Gigabit/10Gb |
| M1 Pro | 10-core | 8P/2E | 16-core | 16GB/32GB | 512GB/1TB/2TB/4TB/8TB | Gigabit/10Gb |
| M1 Max | 10-core | 8P/2E | 24-core | 32GB/64GB | 512GB/1TB/2TB/4TB/8TB | Gigabit/10Gb |
| M1 Max | 10-core | 8P/2E | 32-core | 32GB/64GB | 512GB/1TB/2TB/4TB/8TB | Gigabit/10Gb |
| Dual M1 Max | 20-core | 16P/4E | 48-core | 64GB/128GB | 512GB/1TB/2TB/4TB/8TB | Gigabit/10Gb |
| Dual M1 Max | 20-core | 16P/4E | 64-core | 64GB/128GB | 512GB/1TB/2TB/4TB/8TB | Gigabit/10Gb |


----------



## Yoused

Just for the hell of it


Spoiler: a fanciful rough sketch of how I might design a Mac Studio






Motherboard is vertical; some ports in back, some USB ports and SD slot under front lip; can serve as a monitor stand, with a securing bar from the top edge of the monitor to the top edge of the Mac; may provide for the option of adding a PCIe card chassis in back.

This is obviously kind of goofy, but a similar concept might make some kind of elegant sense.


----------



## B01L

Yoused said:


> Just for the hell of it
> 
> 
> Spoiler: a fanciful rough sketch of how I might design a Mac Studio
> 
> 
> 
> Motherboard is vertical; some ports in back, some USB ports and SD slot under front lip; can serve as a monitor stand, with a securing bar from the top edge of the monitor to the top edge of the Mac; may provide for the option of adding a PCIe card chassis in back.
> 
> This is obviously kind of goofy, but a similar concept might make some kind of elegant sense.




No thank you, Cube 2.0 please...! ;^p


----------



## Yoused

B01L said:


> No thank you, Cube 2.0 please...! ;^p



I have a Cube, and it is _way_ too massive for an M-series Mac.


----------



## B01L

Yoused said:


> I have a Cube, and it is _way_ too massive for an M-series Mac.




Reports on the 2021 ASi MBP laptops say that, with CPU & GPU pegged as much as possible, even the 16" will thermal throttle; it can handle pegging one or the other of the systems, but not both at once...

If these Mac Studio machines are supposed to be for DCC working folk, both CPU & GPU will probably be getting regular workouts, so the more cooling capabilities the better...?

Dual M1 Max SoCs with a 2019 Mac Pro-style heat sink (wide-spaced fat fins) with a pair of 180mm x 30mm high static pressure fans above & below; vertical double-sided mobo (SSD blades on backside) up one side of chassis, Mac mini-style PSU up other side; enough thermal headroom if Apple decides to drop a quad SoC configuration in there at some point...?


----------



## Yoused

B01L said:


> Reports on the 2021 ASi MBP laptops say that, with CPU & GPU pegged as much as possible, even the 16" will thermal throttle; it can handle pegging one or the other of the systems, but not both at once...
> 
> If these Mac Studio machines are supposed to be for DCC working folk, both CPU & GPU will probably be getting regular workouts, so the more cooling capabilities the better...?
> 
> Dual M1 Max SoCs with a 2019 Mac Pro-style heat sink (wide-spaced fat fins) with a pair of 180mm x 30mm high static pressure fans above & below; vertical double-sided mobo (SSD blades on backside) up one side of chassis, Mac mini-style PSU up other side; enough thermal headroom if Apple decides to drop a quad SoC configuration in there at some point...?



Then the Cube makes even _less_ sense. If you are concerned about thermals, you want the least volumetric aspect ratio you can get, not, like the Cube, the least surface area relative to volume.


----------



## B01L

But I am a SFF aficionado & love the G4 Cube (I have worked on them but I do not have one), so Mac Studio = Cube 2.0...! ;^p


----------



## B01L

Looks to be a 4L powerhouse...!


----------



## tomO2013

The other place has reported that there are three problems with existing Apple Silicon MBP solutions that need to be solved for professionals…

1. Will the latest Mac Studio retake the crown from Alder Lake in the stockshrimp chess benchmark?

2. Will it be able to run Cinebench unoptimized faster than the optimized OptiX code path on an 800W+ custom overclocked x86 gaming PC?

3. Will it support as many RGB customizations as an Asus ROG Strix setup?

If it cannot do any of those things, then it will fail spectacularly and potential purchasers will just buy an Xbox Series X with Game Pass instead.

Love,
The other place Apple Silicon forum bashing brigade.


----------



## Chew Toy McCoy

I hope it’s another Mac where you can’t upgrade the drive or RAM later, and where they charge you at least double the market rate for increasing those options on your initial purchase. I don’t care what they are telling investors on those earnings calls. 90% of their profits comes from extreme gouging on drives and RAM.

I have about 18 trillion rubles burning a hole in my pocket right now but the way things are looking I might have to save a bit longer or just blow the whole wad on one AirTag.


----------



## DT

@tomO2013 That's hysterical, I'm sure one of the halfwits charging into that battle is mi7chy ...


----------



## Andropov

Leaked M1 Ultra Geekbench: 1793 single-core, 24055 multi-core.

> Mac13,2 - Geekbench Browser
>
> Benchmark results for a Mac13,2 with an Apple M1 Ultra processor.
>
> (browser.geekbench.com)

Take with a grain of salt, obviously, but seems reasonable.


----------



## Cmaier

DT said:


> @tomO2013 That's hysterical, I'm sure one of halfwits charging into that battle is mi7chy ...




Believe he was suspended today


----------



## Cmaier

Andropov said:


> Leaked M1 Ultra Geekbench: 1793 single-core, 24055 multi-core.
>
> Benchmark results for a Mac13,2 with an Apple M1 Ultra processor.
>
> Take with a grain of salt, obviously, but seems reasonable.




Pretty much what one would predict.


----------



## Deleted member 215

It's dope af, but without a mini-LED ProMotion monitor to go along with it, I can't justify the expense. I'm sure a monitor like that will be coming in the next few years. Right now I'm happy with my MBP (I better be, since it's the most expensive Mac I've ever bought).


----------



## januarydrive7

Cmaier said:


> Believe he was suspended today



I believe I'm on the verge of suspension. I've received 3 PMs from mods today.


----------



## mr_roboto

januarydrive7 said:


> I believe I'm on the verge of suspension.  Have received 3 PMs from mods today



MR's moderation policy is so dumb.  They should've cleaned out obvious bad-faith trolls like mi7chy long ago, but refuse to, apparently out of some insane notion that permitting trolling is how you prevent forums from becoming echo chambers.

Since they coddle trolls, that means there's always little outbreaks of drama.  They compound this by instinctively deleting posts.  This is a bad way to moderate.  For all its faults (and there are many), one of the things the Something Awful forums (for those not familiar, one of the earliest big forums on the internet) got right about moderation is that for the most part, bad posts are left intact.  They just have the mod who took action on the post edit in a big USER WAS PROBATED/BANNED FOR THIS POST notice.  It's a very simple and effective method of communicating to the community "see this? This is what we don't want here".  Disappearing everything just means nobody really knows what will or won't get them in trouble.


----------



## Andropov

Also very not cool to edit someone's posts to remove parts of them that relate to conversations that they've chosen to delete, as they sometimes do. I hadn't seen that in any other forum, ever, and I find it quite infuriating. Either delete my posts or don't delete them, but ffs don't edit them to remove just certain parts of it. Who on earth thinks that's a good idea?


----------



## januarydrive7

mr_roboto said:


> MR's moderation policy is so dumb.  They should've cleaned out obvious bad-faith trolls like mi7chy long ago, but refuse to, apparently out of some insane notion that permitting trolling is how you prevent forums from becoming echo chambers.



See, I'm not against trolling per se --- I even find some of the trolling entertaining, and oftentimes find that trolled threads get lots of great, enlightening responses from those smarter than I (many of you have contributed to that).

The issue is that when someone is obviously trolling, you're not allowed to remotely suggest that trolling is taking place, or else you're insulting someone.


mr_roboto said:


> Since they coddle trolls, that means there's always little outbreaks of drama.  They compound this by instinctively deleting posts.  This is a bad way to moderate.  For all its faults (and there are many), one of the things the Something Awful forums (for those not familiar, one of the earliest big forums on the internet) got right about moderation is that for the most part, bad posts are left intact.  They just have the mod who took action on the post edit in a big USER WAS PROBATED/BANNED FOR THIS POST notice.  It's a very simple and effective method of communicating to the community "see this? This is what we don't want here".  Disappearing everything just means nobody really knows what will or won't get them in trouble.



This seems great, but would never fly at TOP. Revenue from high traffic means that moderation needs to adapt to the soup du jour; setting an actual plumb line of what is acceptable doesn't allow micro-adjustments of moderation to optimize traffic.


----------



## mr_roboto

januarydrive7 said:


> See, I'm not against trolling per se --- I even find some of the trolling entertaining, and oftentimes find that trolled threads get lots of great, enlightening responses from those smarter than I (many of you have contributed to that).
> 
> The issue is that when someone is obviously trolling, you're not allowed to remotely suggest that trolling is taking place, or else you're insulting someone.



I actually agree, but if you're going to be heavyhanded and clamp down on fights like MR mods do, you have to be actually fair about why the fights happened, and that means banning the trolls.  Instead, they let trolls get away with their bullshit as long as they maintain the thinnest veneer of pseudo-civility.

Basically, I'd prefer a trolls allowed + appropriate mocking allowed vibe, with mods stepping in to ban the worst or most boring and repetitive trolls. Failing that, I want no trolls allowed.  Instead it's a weird place where the mods often help the trolls out, and even when they don't, you have to walk on eggshells when responding to trolls.  It sucks a lot.


----------



## quagmire

So how about them Mac Studio's?


----------



## Cmaier

quagmire said:


> So how about them Mac Studio's?




Pixar’s going to buy a lot of ‘em, I bet.


----------



## Eric

quagmire said:


> So how about them Mac Studio's?



You've been working to keep things on topic this week I'll give you that.

It's a great machine and tempting, last year I got the new M1 MBP and it's waaay faster than my 27" iMac, just not sure I can justify the expense at this point but I'll take another look next year.


----------



## chengengaun

Is it outrageous to say that Mac Studio delivers very good value for money (and performance per watt) for those who can make use of it? You can buy the whole system for not much more than a Xeon 8160 (which I use at work) cost when new. And it's even more insane to think that the whole Mac can sit on the desk (or in a drawer), not some 2U rack unit. I'd be even more worried if I were Intel (if Intel is not worried and feeling insecure already).


----------



## Cmaier

chengengaun said:


> Is it outrageous to say that Mac Studio delivers very good value for money (and performance per watt) for those who can make use of it? You can buy the whole system for not much more than a Xeon 8160 (which I use at work) cost when new. And it's even more insane to think that the whole Mac can sit on the desk (or in a drawer), not some 2U rack unit. I'd be even more worried if I were Intel (if Intel is not worried and feeling insecure already).




But alder lake or something! ;-)


----------



## Yoused

Cmaier said:


> But alder lake or something! ;-)



nVidia will kill them. Right?

One thing that confuses me is GPU "cores". The info on the nVidia 30 series says that it has something like 18K+ cores, but the M1 has 7 in the lowest-bin, up to 64 in the top-bin Ultra. What exactly is the linguistic difference here? Is an RTX "core" more like an EU in the M-series? And how does the Neural Engine factor into the difference?


----------



## Nycturne

Yoused said:


> One thing that confuses me is GPU "cores". The info on the nVidia 30 series says that it has something like 18K+ cores, but the M1 has 7 in the lowest-bin, up to 64 in the top-bin Ultra. What exactly is the linguistic difference here? Is an RTX "core" more like an EU in the M-series? And how does the Neural Engine factor into the difference?




You aren’t the only one. While I could very well be wrong, when looking at something like the CUDA core count on nVidia these are more analogous to the count of ALUs in a CPU. nVidia is counting individual processor units in a lot of their marketing, rather than the clusters of processor units (SM units). AMD and Apple use the count of clusters, labeling them as compute units (AMD) or cores (Apple). Even then they aren’t directly comparable, unless talking about different models in the same lineup. 

Thankfully, reviewers tend to focus more on throughput.

As for the neural engine, my understanding is that it is analogous to the tensor cores.


----------



## Andropov

Nycturne said:


> As for the neural engine, my understanding is that it is analogous to the tensor cores.



Speaking of which... the M1 Ultra now has two fully unused Neural Engines.


----------



## mr_roboto

Andropov said:


> Speaking of which... the M1 Ultra now has two fully unused Neural Engines.



Yeah, I'd love to know why they chose to keep the extra Neural Engines dark silicon.  Doesn't even appear to be about yield, as people figured out how to tell which of the two neural engines is the live one on the M1 Max die, and (admittedly in an informal twitter survey) everyone seems to have the exact same one.


----------



## Andropov

mr_roboto said:


> Yeah, I'd love to know why they chose to keep the extra Neural Engines dark silicon.  Doesn't even appear to be about yield, as people figured out how to tell which of the two neural engines is the live one on the M1 Max die, and (admittedly in an informal twitter survey) everyone seems to have the exact same one.



Maybe this is a good theory?
https://www.twitter.com/i/web/status/1477678685935312899/

Maybe there's just no way (yet) to balance the load between the two Neural Engines, or they're working on the API to do it.


----------



## jbailey

So I could replace my 2013 Mac Pro with the 48 GPU core 20 CPU core M1 Ultra Mac Studio with 2 TB and 64 GB for $4399. This compares pretty favorably to the price I paid for a 3.5 GHz 6 core Xeon 16 GB/512GB dual AMD FirePro D500 Mac Pro in 2014. That was $4299 but over the years I've upgraded to 64 GB and 2 TB SSD (I do not remember what those upgrades cost but it was certainly more than $100).

I would lose x86-64 VM compatibility but the Mac Studio might be a little faster. (GB5 864/5036 vs. 1793/24055).
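For a rough sense of scale, those two quoted Geekbench 5 results can be turned into speedup ratios with a few lines (a quick sketch using only the scores quoted above, nothing official):

```python
# Ratio check on the Geekbench 5 scores quoted above.
mac_pro_2013 = {"single": 864, "multi": 5036}    # 3.5 GHz 6-core Xeon
mac_studio   = {"single": 1793, "multi": 24055}  # leaked M1 Ultra result

for metric in ("single", "multi"):
    ratio = mac_studio[metric] / mac_pro_2013[metric]
    print(f"{metric}-core: {ratio:.1f}x")  # prints 2.1x single, 4.8x multi
```

"A little faster" indeed: roughly 2x single-core and nearly 5x multi-core, going by these numbers alone.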


----------



## B01L

Cmaier said:


> Pixar’s going to buy a lot of ‘em, I bet.




AFAIK, Pixar uses a Linux-based pipeline...?



Yoused said:


> One thing that confuses me is GPU "cores". The info on the nVidia 30 series says that it has something like 18K+ cores, but the M1 has 7 in the lowest-bin, up to 64 in the top-bin Ultra. What exactly is the linguistic difference here? Is an RTX "core" more like an EU in the M-series? And how does the Neural Engine factor into the difference?




Multiply the ASi GPU core count by 64 to derive like numbers...


----------



## Yoused

B01L said:


> Multiply the ASi GPU core count by 64 to derive like numbers...



Maybe not so much. For the high-end Ultra, that would mean the equivalent of 4K "cores", which is a very long way from the 18K number I was seeing for some nVidia cards. Although, the power usage on those cards alone is in the 300W range, which is probably quite a bit higher than the peak draw of an entire Ultra SoC.


----------



## januarydrive7

mr_roboto said:


> Yeah, I'd love to know why they chose to keep the extra Neural Engines dark silicon.  Doesn't even appear to be about yield, as people figured out how to tell which of the two neural engines is the live one on the M1 Max die, and (admittedly in an informal twitter survey) everyone seems to have the exact same one.



I was wondering if this might be a "we left a secret in the M1 Max/Ultra" when they introduce Mac Pro.


----------



## sgtaylor5

The Studio Display comes in three configurations, differentiated by the stand, which is not removable from the display. The tilt-only stand and the VESA mount adapter come at no extra cost; the tilt- and height-adjustable stand costs $399 extra.

Wonder how many returns Apple is going to get when people figure this all out?

https://www.macrumors.com/2022/03/09/studio-display-stands-not-interchangeable/


----------



## B01L

Yoused said:


> Maybe not so much. For the high-end Ultra, that would mean the equivalent of 4K "cores", which is a very long way from the 18K number I was seeing for some nVidia cards. Although, the power usage on those cards alone is in the 300W range, which is probably quite a bit higher than the peak draw of an entire Ultra SoC.




And those massive GPU dies are the size of an M1 Max or more, but with no pesky CPU cores, Neural Engine cores, Media Engines, or all that stuff to take room away from more CUDA cores...


----------



## Cmaier

Maybe hold off buying that display…


----------



## Runs For Fun

Cmaier said:


> Maybe hold off buying that display…



Sonofabitch


----------



## Colstan

Cmaier said:


> Maybe hold off buying that display…



The rumors have been all over the place. Unlike the PC side, you need a degree in Kremlinology to divine Apple's intentions. I wonder if this is the rumored 7K display, alleged by 9to5Mac, that will supposedly replace the XDR, which Ross doesn't believe it to be, or the monitor that sits between the Studio Display and XDR which was rumored to be $2,500. Regardless, unless you need a new monitor immediately, it might be wise to wait until after WWDC.


----------



## Cmaier

Colstan said:


> The rumors have been all over the place. Unlike the PC side, you need a degree in Kremlinology to divine Apple's intentions. I wonder if this is the rumored 7K display, alleged by 9to5Mac, that will supposedly replace the XDR, which Ross doesn't believe it to be, or the monitor that sits between the Studio Display and XDR which was rumored to be $2,500. Regardless, unless you need a new monitor immediately, it might be wise to wait until after WWDC.



Yep. It may be a $2500 display with ProMotion and miniLED, which makes for a very interesting product. In my case that would be too big a price jump for those features, but, then, I’m not a pro in an industry where I do video or graphics work.


----------



## Andropov

januarydrive7 said:


> I was wondering if this might be a "we left a secret in the M1 Max/Ultra" when they introduce Mac Pro.



It's not very secret by now.



sgtaylor5 said:


> The Studio Display comes in three configurations, differentiated by the stand, which is not removable from the display. The tilt-only stand and the VESA mount adapter come at no extra cost; the tilt- and height-adjustable stand costs $399 extra.
> 
> Wonder how many returns Apple is going to get when people figure this all out?
> 
> https://www.macrumors.com/2022/03/09/studio-display-stands-not-interchangeable/



Older iMacs and Apple displays could change between VESA and built-in stands. Bit of a shame they removed that.


----------



## Nycturne

Yoused said:


> Maybe not so much. For the high-end Ultra, that would mean the equivalent of 4K "cores", which is a very long way from the 18K number I was seeing for some nVidia cards. Although, the power usage on those cards alone is in the 300W range, which is probably quite a bit higher than the peak draw of an entire Ultra SoC.




I’m not entirely sure where you got the 18K number for Nvidia. The 3090 claims 10496 CUDA cores, and the A6000 isn’t much bigger than that.  https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3090/

@leman at tOP said:


> An Apple GPU core contains 4x 32-wide ALUs, for 128 ALUs or “shader cores/units” in total. How exactly these work and whether each ALU can execute different instruction stream is not clear as far as I know. The Ultra will therefore contain 8192 “shader cores/units” and should support close to 200000 threads in flight.




While not getting there the same way, the math matches what AnandTech guessed at the time of the M1 release (since Apple does not share details): https://www.anandtech.com/show/16252/mac-mini-apple-m1-tested/3

So it’s not 4K vs 18K as far as I can see, but ~8K vs ~10K.
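Assuming leman's figure of 128 ALUs per Apple GPU core (and AMD's convention of 64 stream processors per CU), the normalization above can be sketched as follows; the constants are the quoted assumptions, not official vendor figures:

```python
# Normalize marketing "core" counts down to individual ALUs ("shader units"),
# under the assumptions quoted above (Apple does not publish these details).
ALUS_PER_APPLE_GPU_CORE = 4 * 32  # 4x 32-wide ALUs per Apple GPU core
ALUS_PER_AMD_CU = 64              # 64 stream processors per AMD compute unit

m1_ultra_alus   = 64 * ALUS_PER_APPLE_GPU_CORE  # 64-core M1 Ultra
amd_top_80cu    = 80 * ALUS_PER_AMD_CU          # AMD's top 80-CU RDNA2 part
rtx_3090_alus   = 10496                          # Nvidia already quotes ALUs directly

print(m1_ultra_alus, amd_top_80cu, rtx_3090_alus)  # 8192 5120 10496
```

On this basis the comparison is ~8K vs ~10K ALUs, not 4K vs 18K; throughput, of course, still depends on clocks and architecture, not ALU count alone.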


----------



## DT

There are a couple of fucknuts on MR who literally spent all day in that display thread, not contributing in any meaningful way, simply "liking/loving" every single post critical of the design, from morning till evening.

Wow. 

Don't get me wrong, I think the stand/mount is a less than optimal design in an otherwise stellar-looking product. One of the premium accessory manufacturers, like Twelve South, should build a VESA-compatible stand that matches the design.

Anyway, that's about all I have to say, and it took me a couple of minutes, one time, vs. a __whole__freaking__day__.


----------



## Yoused

Nycturne said:


> I’m not entirely sure where you got the 18K number for Nvidia.



It is possible that I was seeing some leaks/speculation on the 40xx series and conflated that with the 30xx models.


----------



## mr_roboto

Andropov said:


> Maybe there's just no way (yet) to balance the load between the two Neural Engines, or they're working on the API to do it.



M1 Ultra's specs show it as having a 32-core Neural Engine, but since it's composed of two M1 Max dies, we know it really has 64 ANE cores in four clusters of 16. If they needed to do any work on load balancing or APIs, it's already done. That means if they were going to enable all the clusters, they could've done so already.


----------



## januarydrive7

Andropov said:


> It's not very secret by now.
> 
> 
> Older iMacs and Apple displays could change between VESA and built-in stands. Bit of a shame they removed that.



The interconnect at the bottom of the M1 Max wasn't exactly a secret, either (although it was less obvious, as Apple's die shots cut it off entirely).


----------



## Deleted member 215

Cmaier said:


> Yep. It may be a $2500 display with ProMotion and miniLED, which makes for a very interesting product. In my case that would be too big a price jump for those features, but, then, I’m not a pro in an industry where I do video or graphics work.




$2500 might be a low estimate considering the 32" XDR display is $5000 and this one would have a 12 MP webcam and speakers. But maybe the price of the XDR display will be coming down soon...


----------



## Cmaier

TBL said:


> $2500 might be a low estimate considering the 32" XDR display is $5000 and this one would have a 12 MP webcam and speakers. But maybe the price of the XDR display will be coming down soon...




I think the price will come down.  By the time you add stand and nanocoating you're at $3500 anyway.


----------



## Andropov

mr_roboto said:


> M1 Ultra's specs show it as having 32-core Neural Engine, but since it's composed of two M1 Max, we know it really has 64 ANE cores in four clusters of 16 per cluster.  If they needed to do any work on load balancing or APIs, it's already done.  That means if they were going to enable all the clusters, they could've done so already.



Good point.


----------



## dugbug

So did anybody order one?  I bit the bullet presuming that the 27" AS iMac is either no more or a long way out.  Got the studio display and the base ultra

-d


----------



## Cmaier

dugbug said:


> So did anybody order one?  I bit the bullet presuming that the 27" AS iMac is either no more or a long way out.  Got the studio display and the base ultra
> 
> -d



At least one other person here did. Looking forward to your reviews!


----------



## dugbug

Cmaier said:


> At least one other person here did. Looking forward to your reviews!




mid april unfortunately

I’ll be sure to run stockfish benchmarks lol


----------



## Cmaier

dugbug said:


> mid april unfortunately
> 
> I’ll be sure to run stockfish benchmarks lol



Good! Random chess benchmarks are obviously the most important thing


----------



## Andropov

The reviews of the Mac Studio are out. I have no idea why Apple claimed that the M1 Ultra was close to the 3090, and apparently neither does any reviewer. Too bad Andrei is no longer at AnandTech.


----------



## Colstan

Andropov said:


> The reviews of the Mac Studio are out. I have no idea why Apple claimed that the M1 Ultra was close to the 3090, and apparently neither does any reviewer. Too bad Andrei is no longer at AnandTech.



Over at the other place there's open warfare about this issue. There isn't even agreement about exactly what Apple was trying to portray in its vague performance slides. Notice that Apple is using "relative performance" on the y-axis, instead of simply raw performance. I get the feeling that an engineer was trying to explain a performance-per-watt advantage of the Ultra relative to the 3090 to someone in marketing, and then marketing ran with it, resulting in this thing. I'm not certain exactly what Apple is trying to communicate here. The GPUs inside the M1 SoC family are already impressive enough as is, particularly considering that it's a first-generation product. So why bring up the 3090 at all, rather than continuing to compare them to the GPUs inside the previous Intel Macs, which seems to work heavily in their favor?



On top of that, as you said, all of the competent reviewers that do in-depth analysis are no longer in tech journalism. That leaves us with short synthetic benchmarks like Geekbench which have little real-world utility, including compute results using the now defunct OpenCL, GFXBench off-screen tests which have similar issues, Cinebench in which Apple Silicon doesn't appear to be utilized to its full potential, and then of course gaming benchmarks which are considerably hindered because they are running under Rosetta 2. These benchmarks may be useful when doing a quick comparison between chips in the same family, but trying to divine anything meaningful when compared to a different platform is questionable, at best. Heck, at this point using a handheld stopwatch while running the latest beta of Baldur's Gate 3 would be more useful when evaluating the graphics performance of the M1 Ultra when compared to what has been presented thus far, because that's at least a real world game that is ARM native and optimized for Apple Silicon.

Unfortunately, the situation isn't going to get any better without someone stepping up like Andrei or Ian, who both left AnandTech recently. I'm sure most of us remember how the tech press and PC partisans lambasted the A15 for being a warmed-over A14, until Andrei proved otherwise, demonstrating that the A15 was in fact a substantial improvement over the previous generation. I'm going to defer to @Cmaier in his belief that Apple has actually been underselling the performance of Apple Silicon. However, when it comes to marketing the Ultra, Apple hasn't done itself any favors with these vague graphs. Maybe they're useful during a snazzy presentation for Tim Cook and his lieutenants to show off to the general public, but they have little utility for tech-literate nerds who are attempting to judge actual performance, rather than "relative performance". At this point we'd have as much success consulting a witch doctor practicing haruspicy on chicken entrails.


----------



## jbailey

First video tear down of the M1 Ultra Mac Studio. From Max Tech. Very cool. And yes the SSDs are socketed.


----------



## Roller

dugbug said:


> mid april unfortunately
> 
> I’ll be sure to run stockfish benchmarks lol



I may be the other person @Cmaier referred to, but also looking at a mid-April delivery for the Mac Studio and 1 week later for the Studio Display.

I won't be at all surprised if Apple announces a higher-end monitor with mini LED and ProMotion around WWDC, but I can't imagine it'll be < $3500 - $4000 given the cost for the Pro Display XDR. Even if the price for the PD comes down, the new display will be more expensive than I'm comfortable with. 

I keep looking for cheaper alternatives to the Studio Display, but am not optimistic. I don't know if this is true for other people, but with my vision I need a high-res monitor that renders text sharply even at scaled resolutions and also lets me zoom in and out as needed. High brightness is also a plus. I spend much more time working at home than ever, and I'm willing to pay a premium for these features.


----------



## Citysnaps

Roller said:


> I don't know if this is true for other people, but with my vision I need a high-res monitor that renders text sharply even at scaled resolutions and also lets me zoom in and out as needed. High brightness is also a plus. I spend much more time working at home than ever, and I'm willing to pay a premium for these features.




Same here. I immediately noticed the difference on my 2017 27" 5K iMac. And can never go back to a lesser display on a desktop Mac.


----------



## Andropov

jbailey said:


> First video tear down of the M1 Ultra Mac Studio. From Max Tech. Very cool. And yes the SSDs are socketed.




Does that massive rectangle of exposed copper from the motherboard surrounding the SoC have a purpose (other than looking great)? EMI shielding the rest of the board from the SoC?

Also wow, great build quality.


----------



## Cmaier

Andropov said:


> Does that massive rectangle of exposed copper from the motherboard surrounding the SoC have a purpose (other than looking great)? EMI shielding the rest of the board from the SoC?
> 
> Also wow, great build quality.




Hard to say. I’d have guessed thermal, but EMI works too: from the video, it seems what you’ve got there is essentially a Faraday cage, whether that was the goal or not.

Normally, though, it’s the board, not the chip, where you run into EMI problems.


----------



## throAU

Yoused said:


> Maybe not so much. For the high-end Ultra, that would mean the equivalent of 4K "cores", which is a very long way from the 18K number I was seeing for some nVidia cards. Although, the power usage on those cards alone is in the 300W range, which is probably quite a bit higher than the peak draw of an entire Ultra SoC.




Yeah, Nvidia’s core counts are huge, but if you compare to AMD, AMD is still 64 cores per CU. Hence their top 80-CU part has 5,120 cores: about half of Nvidia’s 3090, but with similar performance in some cases.

Apples/Oranges - what matters is throughput, not core count.


----------



## Yoused

I was just reading this bit on the SSD sockets and how storage is not straight-up swappable due to encryption. It made me wonder, who here has made an external boot drive for an AS Mac? How difficult is it?


----------



## Nycturne

Yoused said:


> I was just reading this bit on the SSD sockets and how storage is not straight-up swappable due to encryption. It made me wonder, who here has made an external boot drive for an AS Mac? How difficult is it?




I’d probably look at eclecticlight’s articles about how it all works. It’s doable (especially if you avoid the early Big Sur releases), but it still requires the internal SSD to be functional to boot to the external drive.


----------



## Entropy

Roller said:


> I may be the other person @Cmaier referred to, but also looking at a mid-April delivery for the Mac Studio and 1 week later for the Studio Display.
> 
> I won't be at all surprised if Apple announces a higher-end monitor with mini-LED and ProMotion around WWDC, but I can't imagine it'll be under $3,500–$4,000 given the cost of the Pro Display XDR. Even if the price of the Pro Display comes down, the new display will be more expensive than I'm comfortable with.
> 
> I keep looking for cheaper alternatives to the Studio Display, but am not optimistic. I don't know if this is true for other people, but with my vision I need a high-res monitor that renders text sharply even at scaled resolutions and also lets me zoom in and out as needed. High brightness is also a plus. I spend much more time working at home than ever, and I'm willing to pay a premium for these features.



I’m probably pairing my Studio with a Lenovo P27u-20 27” 4K monitor. Excellent colour gamut, Thunderbolt/DisplayPort/HDMI inputs, it acts like a hub (even Ethernet; not that the Studio needs it, but attached laptops might), and it has a “mini”-LED backlight. The Huawei MateView 28 is in the running too.
The display market is in a transition to HDR, and neither the technology nor the prices are mature yet, unfortunately. For once, depending on your interests, this may be a situation where you don’t buy a screen expecting it to do the job for the next decade-plus.

Edit before replies: Both the above displays offer connectivity to several video sources over several input connector options (the deciding factor for me), and are of course height adjustable without paying another $400 + taxes.


----------



## Joelist

With Apple's recent announcement that a firmware update is coming for the Studio Display, partly to let the webcam fully utilize the DSP and to properly tune it to the display, who else expects a firmware update for the Studio itself? In that case, I would think it would upclock the CPU and GPU cores, plus other items, so that the SoC takes more advantage of the enhanced cooling.


----------



## Cmaier

Joelist said:


> With Apple's recent announcement that a firmware update is coming for the Studio Display, partly to let the webcam fully utilize the DSP and to properly tune it to the display, who else expects a firmware update for the Studio itself? In that case, I would think it would upclock the CPU and GPU cores, plus other items, so that the SoC takes more advantage of the enhanced cooling.




I tend to doubt it? It could be that they tune the throttling mechanisms, but I doubt they’ll raise the top speed.


----------



## Joelist

I was just thinking they might upclock mildly because at present neither the Max nor the Ultra comes even close to fully using the cooling. Or maybe the overpowered cooling is future-proofing for the M2 Max and Ultra?


----------



## Cmaier

Joelist said:


> I was just thinking they might upclock mildly because at present neither the Max nor the Ultra are coming even close to full use of the cooling. Or maybe the overpowered cooling is future proofing against the M2 Max and Ultra?




The issue would be that they probably are already clocking as fast as they feel comfortable - any faster and the die simply might not work.  Chips have a “critical path” through transistors and wires that determines the maximum clock frequency - 1 divided by however long it takes a signal to propagate through that path.  The length of that path (and even which path is the critical path) can vary from chip to chip and wafer to wafer.  So they test everything at a certain speed when it comes off the assembly line, and they wouldn’t ever make it run faster than that.

If the chips had tested good at a higher clock rate, they’d probably already be running at that clock rate.
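As a back-of-the-envelope sketch of that relationship (the delay numbers below are made up for illustration, not real silicon data):

```python
# Maximum clock frequency is set by the slowest ("critical") path:
# f_max = 1 / t_critical. Delays are in picoseconds.

def max_frequency_ghz(path_delays_ps):
    """Return the max clock in GHz given per-path propagation delays in ps."""
    t_critical = max(path_delays_ps)   # the slowest path limits the whole chip
    return 1000.0 / t_critical         # 1/ps would be THz, so scale to GHz

# Hypothetical per-path delays for one die; in practice they vary
# from chip to chip and wafer to wafer, which is why parts are speed-binned.
delays = [250.0, 280.0, 312.5]
print(max_frequency_ghz(delays))       # 1000 / 312.5 = 3.2 GHz
```

This is also why shipping a part above its tested speed is risky: the binned frequency already reflects the worst path actually measured on that die.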


----------



## Nycturne

Joelist said:


> I was just thinking they might upclock mildly because at present neither the Max nor the Ultra are coming even close to full use of the cooling. Or maybe the overpowered cooling is future proofing against the M2 Max and Ultra?




“Overpowered” cooling is also a way to get quiet cooling.


----------



## Joelist

Cmaier said:


> The issue would be that they probably are already clocking as fast as they feel comfortable - any faster and the die simply might not work.  Chips have a “critical path” through transistors and wires that determines the maximum clock frequency - 1 divided by however long it takes a signal to propagate through that path.  The length of that path (and even which path is the critical path) can vary from chip to chip and wafer to wafer.  So they test everything at a certain speed when it comes off the assembly line, and they wouldn’t ever make it run faster than that.
> 
> If the chips had tested good at a higher clock rate, they’d probably already be running at that clock rate.



Point taken. It is likely then part quietness and part futureproofing. With this design they can easily (assuming our assumptions are correct) slide an M2 Max or Ultra into the same system.


----------



## Cmaier

Joelist said:


> Point taken. It is likely then part quietness and part futureproofing. With this design they can easily (assuming our assumptions are correct) slide an M2 Max or Ultra into the same system.



Yep. 

Just saw one in person at the apple store. Bigger than I imagined. No discernible fan noise. 

I imagine in the future the power will go up primarily on the gpu side, and this design can accommodate that.  Not a repeat of the trash can Mac Pro.


----------



## Runs For Fun

So I got mine! I cannot hear it at all unless I put my ear right next to it, even under load. Currently I’m transcoding all my FLAC files to ALAC so I can put them in Apple Music. I love this thing. Paired with the Studio Display, it’s beautiful! I’m coming from a 2K@144Hz display, so this thing is blowing my mind. I hooked that up as a secondary display and oh my, it hurts my eyes now! Once you go 5K you can’t go back!
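For anyone wanting to do the same, a minimal sketch of that kind of batch transcode, assuming ffmpeg is installed and on your PATH (paths and filenames here are hypothetical):

```python
# Batch FLAC -> ALAC transcode via ffmpeg. Assumes `ffmpeg` is on PATH.
from pathlib import Path
import subprocess

def alac_command(flac_path):
    """Build the ffmpeg command to transcode one FLAC file to ALAC (.m4a)."""
    out = Path(flac_path).with_suffix(".m4a")
    return ["ffmpeg", "-i", str(flac_path), "-c:a", "alac", str(out)]

def transcode_all(folder):
    """Transcode every .flac under `folder`, keeping originals untouched."""
    for flac in Path(folder).rglob("*.flac"):
        subprocess.run(alac_command(flac), check=True)

print(alac_command("song.flac"))
# ['ffmpeg', '-i', 'song.flac', '-c:a', 'alac', 'song.m4a']
```

ALAC is lossless, so nothing is lost relative to the FLAC originals; the `.m4a` files import straight into Apple Music.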


----------



## Cmaier

Runs For Fun said:


> So I got mine! I cannot hear it at all unless I put my ear right next to it, even under load. Currently I’m transcoding all my FLAC files to ALAC so I can put them in Apple Music. I love this thing. Paired with the Studio Display, it’s beautiful! I’m coming from a 2K@144Hz display, so this thing is blowing my mind. I hooked that up as a secondary display and oh my, it hurts my eyes now! Once you go 5K you can’t go back!




Yeah, I just spent 15 minutes staring at three of those monitors at the Apple Store… so tempted.  Let us know more about both purchases as you get more insights.

BTW - I forget - which CPU did you get?


----------



## Runs For Fun

Cmaier said:


> Yeah i just spent 15 minutes staring at three of those monitors at the Apple Store…so tempted.  Let us know more about both purchases as you get more insights.
> 
> BTW - I forget - which CPU did you get?



I got the base M1 Ultra


----------



## Cmaier

Runs For Fun said:


> I got the base M1 Ultra



That gives “base” a new meaning.

Damn, that thing must be so sweet.  I used to write software at AMD that we used to design the chips. One thing I wrote was called Agincourt. It was what we called a circuit classifier and static checker: it would look at the circuit netlist (the list of transistors and their interconnections) and try to figure out what each circuit was, and whether it met certain rules. There was a commercial tool, but it would take all night to run. I came up with the idea of using tricks from vision recognition, and a partitioning algorithm we filed a patent on, to speed it up. My tool ran in about an hour for the whole chip, and since it could run in parallel (after the partitioning step), we could distribute it on many machines and get it down to 15 minutes. That was on AMD Athlons and SPARC workstations at the time, and the chips we were designing were Opterons, which had only 100 million transistors. With Ultra we could undoubtedly have done the whole thing in real time as we made edits to the chip, even if the chip was as big as the Ultra itself.
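The partitioning step is the key to the parallelism: once a netlist is split into electrically independent pieces, each piece can be classified on a different machine. A toy sketch of that idea (nothing like the real tool, and the net names are made up):

```python
# Toy sketch of partition-then-classify: group a netlist graph into
# connected components, each of which can be checked independently
# (and therefore in parallel on separate machines).
from collections import defaultdict

def connected_components(edges):
    """Group nodes of a netlist graph into independent partitions."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    seen, parts = set(), []
    for start in graph:
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:                       # iterative DFS over one component
            node = stack.pop()
            if node in comp:
                continue
            comp.add(node)
            stack.extend(graph[node] - comp)
        seen |= comp
        parts.append(comp)
    return parts

# Two electrically independent clusters -> two partitions to farm out.
nets = [("n1", "n2"), ("n2", "n3"), ("n4", "n5")]
print(sorted(len(p) for p in connected_components(nets)))  # [2, 3]
```

Real circuit partitioners are far more subtle (components can be weakly coupled rather than fully disconnected), but the same divide-and-distribute shape is what turns an overnight run into minutes.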


----------



## Colstan

Cmaier said:


> That gives “base” a new meaning.
> 
> Damn, that thing must be so sweet.



I admit that @Cmaier's enthusiasm is infectious. I'm used to him being strictly logical and rational, while occasionally displaying his trademark acerbic humor and wit. Apple must have done something remarkable if it gets a veteran CPU architect's enthusiastic attention.

Over at "the other place", he compared the time we are now living in to the computer wars of the 80s. Back then you could walk into a software store (yes, they actually existed) which had aisles for a half-dozen computer systems, each running different operating systems, with substantially different underlying hardware architectures.

Then things got boring. Microsoft dominated with Windows. The classic Mac OS was relegated to 1-2% marketshare mainly used for desktop publishing and graphic arts. Linux was nothing more than a curiosity, at best. RISC designs slowly disappeared from desktops and workstations.

Even as competition improved, CPUs didn't. It wasn't too long ago that Intel had stagnated at four cores because AMD wasn't competing, and nobody else challenged them. Apple seemed to have little interest in updating the Mac. I recall TidBITS running an article about how the Mac was quickly becoming a device for older generations, and once those folks essentially died out, the Mac would likely go with them. The Mac would become a legacy product, a side project of the iPhone company.

Apple did the exact opposite, completely revitalizing the Mac with their own custom SoC, new industrial designs, and macOS getting a complete overhaul. AMD is back in the game, forcing Intel to innovate, including getting into graphics cards. New desktop ARM designs are coming from the likes of Qualcomm and Nvidia. Microsoft is trying new and interesting things with Windows.

The traditional desktop computer market hasn't been this exciting in decades, and it's great to see, after experiencing stagnation for so long.


----------



## Andropov

Cmaier said:


> I used to write software at AMD that we used to design the chips. One thing I wrote was called Agincourt.  It was what we called a circuit classifier and static checker - it would look at the circuit netlist (the list of transistors and their interconnections) and try to figure out what each circuit was, and whether it met certain rules.



Agincourt? Any relation to the Battle of Agincourt?



Colstan said:


> Apple did the exact opposite, completely revitalizing the Mac with their own custom SoC, new industrial designs, and macOS getting a complete overhaul.



New APIs to write for macOS too (UIKit on the Mac, SwiftUI...).


----------



## Cmaier

Colstan said:


> I admit that @Cmaier's enthusiasm is infectious. I'm used to him being strictly logical and rational, while occasionally displaying his trademark acerbic humor and wit. Apple must have done something remarkable if it gets a veteran CPU architect's enthusiastic attention.
> 
> Over at "the other place", he compared the time we are now living in to the computer wars of the 80s. Back then you could walk into a software store (yes, they actually existed) which had aisles for a half-dozen computer systems, each running different operating systems, with substantially different underlying hardware architectures.
> 
> Then things got boring. Microsoft dominated with Windows. The classic Mac OS was relegated to 1-2% marketshare mainly used for desktop publishing and graphic arts. Linux was nothing more than a curiosity, at best. RISC designs slowly disappeared from desktops and workstations.
> 
> Even as competition improved, CPUs didn't. It wasn't too long ago that Intel had stagnated at four cores because AMD wasn't competing, and nobody else challenged them. Apple seemed to have little interest in updating the Mac. I recall TidBITS running an article about how the Mac was quickly becoming a device for older generations, and once those folks essentially died out, the Mac would likely go with them. The Mac would become a legacy product, a side project of the iPhone company.
> 
> Apple did the exact opposite, completely revitalizing the Mac with their own custom SoC, new industrial designs, and macOS getting a complete overhaul. AMD is back in the game, forcing Intel to innovate, including getting into graphics cards. New desktop ARM designs are coming from the likes of Qualcomm and Nvidia. Microsoft is trying new and interesting things with Windows.
> 
> The traditional desktop computer market hasn't been this exciting in decades, and it's great to see, after experiencing stagnation for so long.



Remember Egghead? 

The reason I turned down a job at Intel and instead first worked on a PowerPC and then on a SPARC was that I really, really wanted something other than x86 to be competitive. Sadly I had to give in, because working with good people and enjoying my job actually mattered (and the awesome team for the PowerPC was scattered when we couldn’t get more funding).


----------



## Cmaier

Andropov said:


> Agincourt? Any relation to the Battle of Agincourt?
> 
> 
> New APIs to write for macOS too (UIKit on the Mac, SwiftUI...).



Yep. My boss was Belgian and a complete tool, and the name annoyed him. Agincourt was scriptable with Perl and provided a C++ SDK. All the class names were prefixed with bs_, which he assumed meant bullshit but actually meant “Belgium sucks.”

On the first day he met me when he joined after i worked there for years, he said “I hear you’re very indispensable around here”

I blushed and said thank you.

He continued “the graveyards of Europe are filled with the corpses of indispensable people.”

I promptly applied to law school, attended every night, graduated first in my class, and four years later resigned gloriously (the way I should have from macrumors) and now here I am.


----------



## Runs For Fun

So this thing is *fast!* I had to reboot for a 600-some-MB iOS update for the Studio Display.
Upon logging back in, all of my apps started simultaneously. I literally had a bunch of windows open at the exact same time. I thought the M1 MacBook Air was fast!


----------



## Cmaier

Runs For Fun said:


> So this thing is *fast!* I had to reboot for a 600-some-MB iOS update for the Studio Display.
> Upon logging back in, all of my apps started simultaneously. I literally had a bunch of windows open at the exact same time. I thought the M1 MacBook Air was fast!




That is awesome.


----------



## Hrafn

Cmaier said:


> Remember Egghead?



Yes, I loved looking at the possibilities of software I couldn't afford to buy.  I used to buy from Egghead.com back when you had to send proof of purchase, UPCs, and receipts for the full discount, and I still get ads for hardware from them. Amazon now, though, same as Woot!


----------



## Roller

Got a shipping notice for my Mac Studio today — should have it next week. The bad news is that the Studio Display won't arrive until at least 9 days later. And I won't be able to return the iMac I'm trading in until I get the display and transfer everything over, which may be near or beyond the end of Apple's two-week return period. Apple support said to call them if that happens.


----------



## Cmaier

Roller said:


> Got a shipping notice for my Mac Studio today — should have it next week. The bad news is that the Studio Display won't arrive until at least 9 days later. And I won't be able to return the iMac I'm trading in until I get the display and transfer everything over, which may be near or beyond the end of Apple's two-week return period. Apple support said to call them if that happens.




Awkward.


----------



## Runs For Fun

Roller said:


> Got a shipping notice for my Mac Studio today — should have it next week. The bad news is that the Studio Display won't arrive until at least 9 days later. And I won't be able to return the iMac I'm trading in until I get the display and transfer everything over, which may be near or beyond the end of Apple's two-week return period. Apple support said to call them if that happens.



Man that sucks.


----------



## DT

Yeah, I would think that if the intent is to trade one device for a replacement, and the replacement might include more than one product/component, they'd give you some X days after the entire order showed up (which I guess is the "call them if that happens" angle)


----------



## Roller

DT said:


> Yeah, I would think if the intent is to trade a device for a device replacement, the latter of which might have more than one product/component, they'd give you some X days after the entire order showed up (which I guess is the "call them if that happens" angle)



When I placed the order, I don’t recall any way to associate the trade-in with the item that would arrive last, but I would have expected Apple to do that. In any case, if I’m nearing the end of the trade-in window and know the display will be late, I’ll tell them to replace the first trade-in with another. Apparently they don’t extend the two weeks you have to return the item. Sorta sucks given how much money I’m dropping on this stuff. 

I’m also not looking forward to the frustration of having a Mac Studio that I can’t use. But not a big deal in the grand scheme of things, for sure.


----------



## DT

Plug it into a TV


----------



## Roller

DT said:


> Plug it into a TV



Yeah, I’ve been thinking of other displays I could use temporarily. I have an old 27” ACD, but I don’t think it’ll work with the Mac Studio.


----------



## DT

Yeah, I think you'd need an adapter.  Does your iMac still support target display mode?  You know, where you can use an iMac as an external display for another machine.


----------



## Roller

DT said:


> Yeah, I think you'd need an adapter.  Does your iMac still support target display mode?  You know, where you can use an iMac as an external display for another machine.



Unfortunately not. My iMac is from 2017. A few years ago, I wanted to use an even older iMac as a display with the 2017 iMac, but Apple limits it to running Catalina or earlier.


----------



## Andropov

Cmaier said:


> On the first day he met me when he joined after i worked there for years, he said “I hear you’re very indispensable around here”
> 
> I blushed and said thank you.
> 
> He continued “the graveyards of Europe are filled with the corpses of indispensable people.”



My first employer's main client was also trying to get rid of 'indispensable people' and 'knowledge wells'. They went from a small team of highly skilled people to a throw-bodies-at-the-problem approach. Quality was going down significantly when I left. I always thought that the problems management had dealing with so many people were textbook parallelization problems, but with people instead of threads.



Cmaier said:


> I promptly applied to law school, attended every night, graduated first in my class, and four years later resigned gloriously (the way I should have from macrumors) and now here I am.



Ha! I've been planning on something quite similar myself. I'm applying to med school next year.



Roller said:


> Got a shipping notice for my Mac Studio today — should have it next week. The bad news is that the Studio Display won't arrive until at least 9 days later. And I won't be able to return the iMac I'm trading in until I get the display and transfer everything over, which may be near or beyond the end of Apple's two-week return period. Apple support said to call them if that happens.



I sent my trade-in MacBook Pro a day late, and they received it several days after I shipped it. The trade-in period may not be heavily enforced; the day the 14-day period ended, I got a reminder email asking if I was still planning to trade it in. I'd just call Apple if the Studio Display arrives after that period; they'll provide an extension for sure.


----------



## Joelist

Actually, a nice little video about the pitfalls of benchmarking: he alludes to the issue Geekbench has on Apple Silicon, where the bursty nature of the test makes Apple Silicon scores look artificially low.


----------



## Yoused

Andropov said:


> I always thought that the problems management had dealing with so many people were textbook parallelization problems, but with people instead of threads.



Threading is like warp factors. Performance goes up, but the effort required to make a multithreaded app function well is typically comparable to the performance improvement, sometimes greater. There were a few times at work when I thought how much faster I could get this done if I had another of me – until I imagined how difficult I would be to get along with.
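Amdahl's law puts numbers on that intuition: whatever fraction of the work stays serial caps the speedup, no matter how many threads (or copies of yourself) you add. A quick sketch:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# where p is the parallelizable fraction of the work and n the worker count.

def amdahl_speedup(p, n):
    """Ideal speedup for parallel fraction p (0..1) across n workers."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with 90% of the work parallelized, 16 threads give well under 16x.
print(round(amdahl_speedup(0.9, 16), 2))  # 6.4
```

And that is the *ideal* case, before paying any of the synchronization and coordination costs the analogy is about.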


----------



## Andropov

Yoused said:


> Threading is like warp factors. Performance goes up, but the effort required to make a multithreaded app function well is typically comparable to the performance improvement, sometimes greater. There were a few times at work when I thought how much faster I could get this done if I had another of me – until I imagined how difficult I would be to get along with.



Some projects can be parallelized with relative ease, and there it's worth doing. Others can't. The problem is, if you have 40 people on a project, you're gonna need some senior devs with a good understanding of the project managing who does what and how. Otherwise you end up with a Frankenstein project, where different parts of the app have different architectures, or duplicated code that does the same thing in different ways (since nobody knew the other code existed at all). And you have zero visibility into what other people are doing. You can set up sync meetings for small teams, but it's unmanageable for teams of dozens of people. And then the senior devs in charge of directing the project burn out, or change jobs, and management just adds more juniors at the base of the project.


----------



## Nycturne

Andropov said:


> Some projects can be parallelized with relative ease, and there it's worth doing. Others can't. The problem is, if you have 40 people on a project, you're gonna need some senior devs with a good understanding of the project managing who does what and how. Otherwise you end up with a Frankenstein project, where different parts of the app have different architectures, or duplicated code that does the same thing in different ways (since nobody knew the other code existed at all). And you have zero visibility into what other people are doing. You can set up sync meetings for small teams, but it's unmanageable for teams of dozens of people. And then the senior devs in charge of directing the project burn out, or change jobs, and management just adds more juniors at the base of the project.




I'd be laughing if this wasn't so accurate. 

The Frankenstein projects I see a lot of are the ones full of half-completed transitions.


----------



## MEJHarrison

I'm getting ready to order the 24-core M1 Max version.  I was set on 1TB.  Then I noticed that my current 1TB is 80% full.  I looked into external SSDs, but nothing can come close to Apple's SSD speeds.  So I might just suck it up and get a 2TB model.


----------



## bunnspecial

Roller said:


> Yeah, I’ve been thinking of other displays I could use temporarily. I have an old 27” ACD, but I don’t think it’ll work with the Mac Studio.




I used one for a while with my M1 MBP (and then went a step further and started using the same-size Thunderbolt Display with it).

This is what I initially used: https://www.amazon.com/AllSmartLife-DisplayPort-Aluminium-resolution-ChromeBook/dp/B017TZTMBG (although I later switched to using a TB hub/dock that had mini-DP out, then, as I mentioned, ditched the ACD entirely for the near-identical TB display...)


----------



## Roller

bunnspecial said:


> I used one for a while with my M1 MBP(and then went a step further and started using the same size Thunderbolt display with it).
> 
> This is what I initially used https://www.amazon.com/AllSmartLife-DisplayPort-Aluminium-resolution-ChromeBook/dp/B017TZTMBG (although I later switched to using a TB hub/dock that had mini-DP out, then as I mentioned ditched ACD entirely for the near identical TB display...)



Thanks. I got my Mac Studio today and connected it to an HP 27” monitor from work. It’ll do until my Studio Display arrives in 2-3 weeks. I’m transferring stuff from backup and will then reinstall third-party apps. I’ll post first impressions when I’ve had a day or so to put it through its paces.


----------



## Roller

A few preliminary observations:

Setup went much faster than expected. I copied everything but apps from a Carbon Copy Cloner backup, which took ~80 minutes for ~700 GB. Next, I reinstalled all my apps, working off a list I prepared in advance. Downloading and running installers was quick, too. I started with 1Password, which is where I have activation keys and other required info stored.
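For the curious, that works out to roughly 146 MB/s sustained (assuming decimal gigabytes):

```python
# Rough sustained throughput implied by ~700 GB copied in ~80 minutes.
gb, minutes = 700, 80
mb_per_s = gb * 1000 / (minutes * 60)  # decimal GB -> MB, minutes -> seconds
print(round(mb_per_s))                 # ~146 MB/s
```

That's consistent with the source backup drive, not the Studio's internal SSD, being the bottleneck.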


I haven't yet decided how to back up the computer. It has a 2 TB SSD, and the external SSDs I've used for backup are 1 TB. Unfortunately, SSDs are still pricier than spinning disks, so I may go that route at the expense of speed. I'd like to hear what y'all suggest.


I can definitely hear some low level white noise. Once you know it's there, the sound is difficult to completely ignore, but it's not objectionable and won't affect my daily use, since it's quieter than typical ambient levels in my home office. I haven't yet exercised the machine enough for the fans to ramp up, though.


It's certainly faster than the 2017 iMac it's replacing. I'll know more tomorrow when I use it for some work projects. I will say that it boots a lot more quickly, LOL.


I still have a lot of peripherals with USB-A connectors, including a printer, microphone, wired mouse, and some other stuff. My old 4-port hub is doing fine, though.


I bought a Touch ID keyboard yesterday. It's identical to my old keyboard apart from the Touch ID sensor, which is incredibly convenient.


----------



## MEJHarrison

MEJHarrison said:


> I'm getting ready to order the 24-core M1 Max version.  I was set on 1TB.  Then I noticed that my current 1TB is 80% full.  I looked into external SSDs, but nothing can come close to Apple's SSD speeds.  So I might just suck it up and get a 2TB model.




Still haven't ordered it.  At this point the shipping date is May 23–May 30. I think I'll just wait till it slips into early June; then hopefully I'll get it around the time of WWDC. That way, if there's any announcement there that knocks my socks off, I can just send the Studio back and still be within that 14-day window. If not, I'll be ready to play with new betas.


----------



## Yoused

When the power blinked yesterday and then was out for an hour or so, it occurred to me: years ago, I had a big ugly UPS that saved me some lost work a couple of times. Given that the M-series is geared toward power efficiency, would it make sense for Apple to put a small battery or supercap in its desktop models to give them a chance to save state when the power goes out unexpectedly?


----------



## throAU

Cmaier said:


> The reason I turned down a job at Intel and instead first worked on a PowerPC and then on a SPARC was that I really, really wanted something other than x86 to be competitive. Sadly I had to give in, because working with good people and enjoying my job actually mattered (and the awesome team for the PowerPC was scattered when we couldn’t get more funding).




This is why I'm such an ARM/Apple Silicon cheerleader: wanting something other than x86 to be competitive, and to win. x86 is just... really inherently garbage. Anyone who has coded for the thing in assembly language will attest: it's just nasty. It doesn't matter how fast they make it run, it just plain feels... dirty.

Having x86 be the de-facto standard just... hurts.


The performance speaks for itself in terms of performance per watt.

While no doubt not as old as some here, I'm old enough to remember the 80s computing landscape and machines like the Amiga, ST, and Archimedes (also the 8-bits before them, which were properly powerful in their time). Especially the Archimedes: it was the first ARM machine. I always wanted one, and with Apple Silicon, the Pis, etc., I at least have a descendant.


----------



## Eric

Roller said:


> A few preliminary observations:
> 
> Setup went much faster than expected. I copied everything but apps from a Carbon Copy Cloner backup, which took ~80 minutes for ~700 GB. Next, I reinstalled all my apps working off a list I prepared in advance. Downloading and running installers was quick, too. I started with 1Password, which is where I have activation keys and other required info. stored.
> 
> 
> I haven't yet decided how to back up the computer. It has a 2 TB SSD, and the external SSDs I've used for backup are 1 TB. Unfortunately, SSDs are still pricier than spinning disks, so I may go that route at the expense of speed. I'd like to hear what y'all suggest.
> 
> 
> I can definitely hear some low level white noise. Once you know it's there, the sound is difficult to completely ignore, but it's not objectionable and won't affect my daily use, since it's quieter than typical ambient levels in my home office. I haven't yet exercised the machine enough for the fans to ramp up, though.
> 
> 
> It's certainly faster than the 2017 iMac it's replacing. I'll know more tomorrow when I use it for some work projects. I will say that it boots a lot more quickly, LOL.
> 
> 
> I still have a lot of peripherals with USB A connectors, including a printer, microphone, wired mouse, and some other stuff. My old 4-port hub is doing fine, though.
> 
> 
> I bought a Touch ID keyboard yesterday. It's identical to my old keyboard apart from the Touch ID sensor, which is incredibly convenient.



Very much the same experience for me, only my iMac was from 2015 and had become so slow that at times I wanted to throw it through the window, lol. I got the base model Mac Studio; the biggest drawback was the smaller 512GB drive, and I didn't feel like waiting around for a custom-built model to up it to 1TB, so I made it work. I needed to get rid of some old data anyway.

Since I've already been using the M1 in my MBP, I have an idea of just how capable this system is; it smokes on the Mac Studio just as well, and I'm more than happy with it. The only real thing I'm missing is that beautiful 5K monitor that was built into the iMac. I ended up with an LG 27” IPS 4K UHD, and it's not bad, but to get the same quality I had before would be out of my price range. It's a worthy tradeoff, though.

Adobe LR and PS open up nice and fast now. Though I'm still at the beck and call of my external USB drive to load my photos, it's still far more usable. I also ended up getting the Touch ID keyboard; it's a great feature on the MBP that I also like having on the new Studio.


----------



## ArgoDuck

For anyone interested, Craig Hunter has posted the review I’d hoped to see, concerning the CPU capabilities of the Studio, and the Ultra in particular, applied in a scientific or engineering context. It’s here: hrtapps.com/blogs/20220427/

His comparison with several generations of Mac Pro (up to the 28 core), iMac Pro (18) and I think a MacBook Pro from a few years ago is quite interesting…


----------



## Yoused

He shows the Ultra pantsing a 28-core Cascade Lake processor using only 6 cores. This is on an airfoil flow dynamics calculation. What I wonder about, though, is why he is running this on the CPU. It seems like this sort of work is the embarrassingly parallel stuff that belongs on the GPU.






Interesting that the performance does not seem to decay as more cores are recruited. Apple's bus bandwidth appears to be just that good.


----------



## ArgoDuck

^ true that, about why not GPUs. But so refreshing to see a review not focussed on content creation 

Apparently, he plans a GPU review in future.

The almost ruler straight linear relationship between performance and cores is - as you say - just remarkable. I’d hoped for something like this…


----------



## mr_roboto

Yoused said:


> He shows the Ultra pantsing a 28-core Cascade Lake processor using only 6 cores. This is on an airfoil flow dynamics calculation. What I wonder about, though, is why he is running this on the CPU. It seems like this sort of work is the embarrassingly parallel stuff that belongs on the GPU.



I'm guessing the software package he used, NASA TetrUSS, doesn't support GPU.

Your comment made me curious and according to my highly scientific googling process, CFD software has only begun to get ported to GPUs quite recently.

I suspect it might be one of those categories of simulation software which isn't always as embarrassingly parallel as you'd hope, or takes a lot of work to transform into a fully embarrassingly parallel problem.  I have no idea if this is how CFD actually works, but if a CFD solver divides space up into cells and simulates what's happening to each one, well, for every time step the cells probably need a lot of cross-communication.  Lots of neighbor influence.  GPUs are at their best when each GPU core gets to do the same math on different, completely independent data.
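That neighbor-coupling point can be sketched in a few lines. This is a toy Python illustration of the general idea only (it has nothing to do with how TetrUSS actually works, and the grid and constants are made up): an elementwise operation touches each cell independently and is embarrassingly parallel, while even a simple diffusion stencil makes every cell read its four neighbors each time step, which is exactly the cross-communication that complicates a GPU port.

```python
# Toy sketch, not real CFD: contrast an embarrassingly parallel
# elementwise operation with a neighbor-coupled stencil update.

def elementwise(u):
    # Each output cell depends only on its own input cell:
    # trivially parallel, ideal GPU work.
    return [[x * x for x in row] for row in u]

def diffusion_step(u, alpha=0.1):
    # Each interior cell reads its four neighbors every time step,
    # so parallel workers must exchange boundary ("halo") data.
    n, m = len(u), len(u[0])
    out = [row[:] for row in u]
    for i in range(1, n - 1):
        for j in range(1, m - 1):
            out[i][j] = u[i][j] + alpha * (
                u[i - 1][j] + u[i + 1][j]
                + u[i][j - 1] + u[i][j + 1]
                - 4 * u[i][j]
            )
    return out

# A single hot cell spreads heat to its four neighbors in one step.
grid = [[0.0] * 8 for _ in range(8)]
grid[4][4] = 1.0
grid = diffusion_step(grid)
sq = elementwise([[1.0, 2.0]])
```

The stencil itself is still data-parallel per time step; the pain is the communication between steps, which only gets worse on unstructured meshes like the ones aerodynamic solvers use.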


----------



## Andropov

mr_roboto said:


> I suspect it might be one of those categories of simulation software which isn't always as embarrassingly parallel as you'd hope, or takes a lot of work to transform into a fully embarrassingly parallel problem.  I have no idea if this is how CFD actually works, but if a CFD solver divides space up into cells and simulates what's happening to each one, well, for every time step the cells probably need a lot of cross-communication.  Lots of neighbor influence.  GPUs are at their best when each GPU core gets to do the same math on different, completely independent data.



Here's my guess at what the problem with making it parallel might be (I don't know, it's just a guess). First, a recap of how fluid simulations are done on GPUs, from the book GPU Gems 3, in case you're interested. The article focuses on the Navier-Stokes equations for *incompressible flow*. But real fluids are compressible to some degree, some more than others, obviously. Water can be approximated as incompressible in many settings, but for fluids like air in the atmosphere, compressibility can't be ignored in any realistic setting.

Anyway: the method in the article cleverly uses an implicit integration method to handle advection (see the Advection section). That works because the fluid is incompressible; hence, the inflowing particles, and the quantities associated with them, come to each cell from another, *single* cell. So it's massively parallel, almost trivially: for each cell, you receive flow from one other cell, whose properties you can look up. The amount of work you need to do per cell is fixed.

Now, for compressible flow? Advection can come from many cells at once (or from none at all). There's no upper bound on how many neighbouring cells can contribute to a single cell. The whole cell-as-a-particle approximation in the article breaks down: you can't trace back where the 'cell particle' comes from, because there's an unknown number of 'particles' in each cell (not just one). So you could have cells where advection comes primarily from a single other cell (as in incompressible flow) and cells at high pressure with inflowing contributions from thousands of other cells.

I'm sure there are clever ways around this, but I think that looks like a massive roadblock in the way of a GPU-capable implementation. On the CPU, there's no such problem. Integrate the equations explicitly (with something like Runge-Kutta or Euler), and for every origin cell you process, you modify (serially) whatever cells are affected by the origin cell. Since the implementation is serial, whether a given destination cell happens to be modified once or a thousand times is a non-issue.
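The gather-vs-scatter contrast above can be sketched in a few lines of Python. Everything here is a made-up 1D toy, not real CFD; the cell values and connectivity are arbitrary. The 'gather' version pulls each cell's new value from exactly one known source cell (the incompressible, GPU-friendly case), while the 'scatter' version pushes flux from each origin cell to a variable number of destinations, which is trivial serially but would need atomics or a reduction on a GPU because different origins can hit the same destination.

```python
# Toy 1D sketch of gather vs scatter advection (not real CFD).

def advect_gather(q, src):
    # Incompressible-style step: every cell pulls its new value from
    # exactly one source cell. Outputs are independent, so one GPU
    # thread per cell works naturally.
    return [q[src[i]] for i in range(len(q))]

def advect_scatter(q, dests):
    # Compressible-style explicit step: each origin cell pushes its
    # quantity to however many destination cells it affects. Several
    # origins may write the same destination, so a serial loop is the
    # easy path; a GPU would need atomics or a reduction.
    out = [0.0] * len(q)
    for i, targets in enumerate(dests):
        for j in targets:
            out[j] += q[i] / len(targets)
    return out

q = [1.0, 2.0, 4.0, 0.0]
gathered = advect_gather(q, src=[0, 0, 1, 2])        # one source per cell
scattered = advect_scatter(q, dests=[[1], [1, 2], [2], []])
```

In the scatter case, cell 2 receives flux from both cell 1 and cell 2's own contribution, and cell 3 receives nothing at all, which is exactly the variable-contributor situation that has no fixed per-cell work bound.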


----------



## MEJHarrison

Continuing with my completely unscientific tests, this time I performed a thermal test.  Last week I stuck a Hershey's bar (dark) on top of the case.  Yesterday I opened it up and it wasn't melted at all.


----------



## Cmaier

MEJHarrison said:


> Continuing with my completely unscientific tests, this time I performed a thermal test.  Last week I stuck a Hershey's bar (dark) on top of the case.  Yesterday I opened it up and it wasn't melted at all.




That’s how it’s done.


----------



## Citysnaps

MEJHarrison said:


> Continuing with my completely unscientific tests, this time I performed a thermal test.  Last week I stuck a Hershey's bar (dark) on top of the case.  Yesterday I opened it up and it wasn't melted at all.




Dr. Feynman nods and smiles from up above!


----------



## Colstan

Cmaier said:


> That’s how it’s done.



Clearly, this is how @Cmaier and AMD solved Athlon XP thermal issues. To prevent spontaneous semiconductor combustion, due to lack of an on-die gamma-ray spectrometer to prevent cosmic rays, the Opteron team used cacao butter as thermal paste to supplement copper, aluminum, and cesium heatsinks. Instead of worrying about a crushed die during a bad mount, they'd use a Twix bar as a shim, and that's after applying a licorice-derived compound to the substrate. Rumor has it that AMD even had Snickers bars inside hermetically sealed containers that were labeled "in case of emergency, smash sugar glass". During the infamous lederhosen shortage at the Dresden fab, Jerry Sanders allegedly kept handy boxes of Schogetten, just in case of pipeline stalls in AMD's Itanium workstations. Not only do "real men have fabs", but they have mostly authentic candy to go with their Jägermeister-inspired CPU designs. At the time, it was widely reported among the tech press that @Cmaier learned German to communicate with AMD's Prussian employees, despite the heavy enforcement of a pantaloons-only dress code, but it was in fact to ensure ample supplies of Schwarzwälder Kirschtorte, just in case Fab 30 went offline, after the now-defunct Fab 69 was a proven frustration, when AMD's deeply embedded servers failed penetration testing, once programs were compiled using Teutonic tart flags.

I learned all of this from the well-informed, historically accurate, highly logical, even-handed, scientifically astute, and very respectful posts over at the MR forums, when I made an inquiry with fanboys of AMD (Amazing Milk-chocolate Delicacies).


----------



## Cmaier

Colstan said:


> Clearly, this is how @Cmaier and AMD solved Athlon XP thermal issues. To prevent spontaneous semiconductor combustion, due to lack of an on-die gamma-ray spectrometer to prevent cosmic rays, the Opteron team used cacao butter as thermal paste to supplement copper, aluminum, and cesium heatsinks. Instead of worrying about a crushed die during a bad mount, they'd use a Twix bar as a shim, and that's after applying a licorice-derived compound to the substrate. Rumor has it that AMD even had Snickers bars inside hermetically sealed containers that were labeled "in case of emergency, smash sugar glass". During the infamous lederhosen shortage at the Dresden fab, Jerry Sanders allegedly kept handy boxes of Schogetten, just in case of pipeline stalls in AMD's Itanium workstations. Not only do "real men have fabs", but they have mostly authentic candy to go with their Jägermeister-inspired CPU designs. At the time, it was widely reported among the tech press that @Cmaier learned German to communicate with AMD's Prussian employees, despite the heavy enforcement of a pantaloons-only dress code, but it was in fact to ensure ample supplies of Schwarzwälder Kirschtorte, just in case Fab 30 went offline, after the now-defunct Fab 69 was a proven frustration, when AMD's deeply embedded servers failed penetration testing, once programs were compiled using Teutonic tart flags.
> 
> I learned all of this from the well-informed, historically accurate, highly logical, even-handed, scientifically astute, and very respectful posts over at the MR forums, when I made an inquiry with fanboys of AMD (Amazing Milk-chocolate Delicacies).




That's right. Excellent.


----------



## Yoused

Intel, on the other hand, puts a tiny 238-PuO2 RTG inside each of their processors to supplement the heat output.


----------



## Agent47

Cmaier said:


> That's right. Excellent.



Ridiculous. Absolute humbug. There are no Lederhosen in Dresden.


----------



## Cmaier

Agent47 said:


> Ridiculous. Absolute humbug. There are no Lederhosen in Dresden.




We brought our Lederhosen over from Sunnyvale. All the cool kids in Silicon Valley wear Lederhosen.

Once, when we were in Italy, a guy from Dresden sitting next to us in a pizzeria tried to invite himself to my wife and my hotel room. My German was barely good enough to get us out of that problem. Or bad enough to get us into it.

My point is simply that everyone in Dresden must be a Lederhosen-wearing lunatic.


----------



## Agent47

I got my first Lederhose at the age of 3. Genuine stag. I didn't grow up in Dresden, though.


----------



## Cmaier

Agent47 said:


> I got my first Lederhose at the age of 3. Genuine stag. I didn't grow up in Dresden, though.



Do you live in Austria? My grandparents came from Vienna, so I speak German like an Austrian. If it were 1940. And if I were in second grade.

My grandma described my accent as "sweet". She was very polite.


----------



## Agent47

Yes, about 40 minutes from Vienna, near Steyr.


----------



## Cmaier

Agent47 said:


> Yes, about 40 minutes from Vienna, near Steyr.



Very cool. I’m actually in the process of applying for Austrian dual-citizenship, though the process will take forever.


----------



## Agent47

Cmaier said:


> Very cool. I’m actually in the process of applying for Austrian dual-citizenship, though the process will take forever.



Yeah, but I believe if you can prove your heritage they'll grant it. The bureaucracy is kind of slow, but it seems to be improving.
Hehe, and you can avoid the draft!


----------



## Cmaier

Agent47 said:


> Yeah, but I believe if you can prove your heritage they'll grant it. The bureaucracy is kind of slow, but it seems to be improving.
> Hehe, and you can avoid the draft!




I am way too old for the draft.


----------



## Agent47

Cmaier said:


> I am way too old for the draft



I'm aware; still, it would be funny to see you in uniform.


----------



## Agent47

Or, even better, to see you crawling through the sludge while the drill instructor yells at you.

scnr, too hilarious


----------

