Mac Studio

Mid-April, unfortunately.

I’ll be sure to run Stockfish benchmarks lol
I may be the other person @Cmaier referred to, but I'm also looking at a mid-April delivery for the Mac Studio and one week later for the Studio Display.

I won't be at all surprised if Apple announces a higher-end monitor with mini LED and ProMotion around WWDC, but I can't imagine it'll be less than $3,500-$4,000 given the cost of the Pro Display XDR. Even if the price of the Pro Display comes down, the new display will be more expensive than I'm comfortable with.

I keep looking for cheaper alternatives to the Studio Display, but am not optimistic. I don't know if this is true for other people, but with my vision I need a high-res monitor that renders text sharply even at scaled resolutions and also lets me zoom in and out as needed. High brightness is also a plus. I spend much more time working at home than ever, and I'm willing to pay a premium for these features.
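For what it's worth, here's a rough back-of-the-envelope on why 5K in particular helps with sharp text at scaled sizes. The panel figures below are the usual 27-inch numbers, so treat this as a sketch rather than a spec sheet:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# Typical 27" panels (assumed figures, for illustration only)
panels = {
    "27in 5K (Studio Display)": (5120, 2880, 27),
    "27in 4K": (3840, 2160, 27),
    "27in 1440p": (2560, 1440, 27),
}

for name, (w, h, d) in panels.items():
    # Scale factor relative to the classic 2560x1440 "looks like" desktop
    scale = w / 2560
    print(f"{name}: {ppi(w, h, d):6.1f} PPI, {scale:.2f}x of a 2560x1440 desktop")

# 5K is an exact 2x of 2560x1440, so macOS can render text at an integer scale;
# a 27" 4K panel at the same "looks like" size needs fractional (1.5x) scaling,
# which is where text tends to look slightly softer.
```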
 

Same here. I immediately noticed the difference on my 2017 27" 5K iMac, and I can never go back to a lesser display on a desktop Mac.
 
First video teardown of the M1 Ultra Mac Studio, from Max Tech. Very cool. And yes, the SSDs are socketed.



Does that massive rectangle of exposed copper on the motherboard surrounding the SoC have a purpose (other than looking great)? EMI shielding the rest of the board from the SoC?

Also wow, great build quality.
 

Hard to say. I’d have guessed thermal, but EMI works too - from the video it seems what you’ve got there is essentially a Faraday cage, whether that was the goal or not.

Normally, though, it’s the board, not the chip, where you run into EMI problems.
 
Maybe not so much. For the high-end Ultra, that would mean the equivalent of 4K "cores", which is a very long way from the 18K number I was seeing for some Nvidia cards. Although, the power usage on those cards alone is in the 300W range, which is probably quite a bit higher than the peak draw of an entire Ultra SoC.

Yeah, Nvidia’s core counts are huge, but if you compare to AMD, AMD is still 64 cores per CU. Hence their top 80-CU part has 5120 cores, about half of Nvidia’s 3090, but with similar performance in some cases.

Apples/Oranges - what matters is throughput, not core count.
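To put rough numbers on "throughput, not core count": peak FP32 is roughly ALU count × 2 FLOPs per clock (a fused multiply-add) × clock speed. The clocks below are approximate boost figures I'm assuming, and peak FP32 ignores memory bandwidth, drivers, and workload, so this is only a ballpark sketch:

```python
# Rough peak-FP32 estimate: ALUs * 2 FLOPs/cycle (fused multiply-add) * clock.
# Clock speeds are approximate boost figures, assumed for illustration.
gpus = {
    "Nvidia RTX 3090": (10496, 1.70e9),            # CUDA cores, ~boost clock (Hz)
    "AMD RX 6900 XT (80 CU)": (5120, 2.25e9),      # 80 CU * 64 cores/CU
    "Apple M1 Ultra (64-core GPU)": (8192, 1.30e9),# 64 cores * 128 ALUs
}

for name, (alus, clock_hz) in gpus.items():
    tflops = alus * 2 * clock_hz / 1e12
    print(f"{name}: ~{tflops:.1f} TFLOPS peak FP32")

# Real-world performance diverges from these peaks, which is exactly why raw
# core counts across vendors are apples and oranges.
```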
 
I was just reading this bit on the SSD sockets and how storage is not straight-up swappable due to encryption. It made me wonder, who here has made an external boot drive for an AS Mac? How difficult is it?

I’d probably look at eclecticlight’s articles about how it all works. It’s doable (especially if you avoid the early Big Sur releases), but it still requires the internal SSD to be functional to boot to the external drive.
 
I’m probably pairing my Studio with a Lenovo P27u-20 27" 4K monitor. Excellent colour gamut, Thunderbolt/DisplayPort/HDMI inputs, it acts like a hub (even Ethernet, not that the Studio needs it, but attached laptops might), and it has a “mini”-LED backlight. The Huawei Mateview 28 is in the running too.
The display market is in a transition to HDR, and unfortunately neither the technology nor the prices are mature. For once, this is a situation where, depending on your interests, you might not be buying a screen to do the job for the next decade-plus.

Edit before replies: Both of the above displays offer connectivity to several video sources over several input connector options (the deciding factor for me), and are of course height-adjustable without paying another $400 plus taxes.
 
With the recent announcement by Apple that a firmware update will be forthcoming for the Studio Display, partly to allow the webcam to fully utilize the DSP and to properly tune it to the display, who else expects a firmware update for the Studio itself? In that case, I would think it would mean upclocking the CPU and GPU cores, plus other items, so that the SoC takes more advantage of the enhanced cooling.
 

I tend to doubt it? It could be that they tune the throttling mechanisms, but I doubt they’ll raise the top speed.
 
I was just thinking they might upclock mildly because, at present, neither the Max nor the Ultra is coming even close to making full use of the cooling. Or maybe the overpowered cooling is future-proofing for the M2 Max and Ultra?
 

The issue would be that they probably are already clocking as fast as they feel comfortable - any faster and the die simply might not work. Chips have a “critical path” through transistors and wires that determines the maximum clock frequency - 1 divided by however long it takes a signal to propagate through that path. The length of that path (and even which path is the critical path) can vary from chip to chip and wafer to wafer. So they test everything at a certain speed when it comes off the assembly line, and they wouldn’t ever make it run faster than that.

If the chips had tested good at a higher clock rate, they’d probably already be running at that clock rate.
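To make the math concrete with made-up numbers (the delay below is purely illustrative, not an Apple figure):

```python
# Toy illustration of how the critical path caps clock speed.
# The delay value is made up; real numbers come from post-layout timing analysis.
critical_path_delay_s = 310e-12   # assume the slowest path takes 310 picoseconds

f_max_hz = 1 / critical_path_delay_s
print(f"Max clock: {f_max_hz / 1e9:.2f} GHz")              # ~3.23 GHz

# Shave 10 ps off that path (or bin a die whose path happens to be faster)
# and the ceiling moves up:
print(f"With a 300 ps path: {1 / 300e-12 / 1e9:.2f} GHz")  # ~3.33 GHz
```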
 
Point taken. It is likely, then, part quietness and part future-proofing. With this design they can easily (assuming our assumptions are correct) slide an M2 Max or Ultra into the same system.
 
Yep.

Just saw one in person at the Apple Store. Bigger than I imagined. No discernible fan noise.

I imagine in the future the power will go up primarily on the GPU side, and this design can accommodate that. Not a repeat of the trash can Mac Pro.
 
So I got mine! I cannot hear it at all unless I put my ear right next to it. Even under load. Currently I’m transcoding all my FLAC files to ALAC so I can put them in Apple Music. I love this thing. Paired with the Studio Display it’s beautiful! I’m coming from a 2K@144Hz display, so this thing is blowing my mind. I hooked that up as a secondary display and oh my, it hurts my eyes now! Once you go 5K you can’t go back!
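In case anyone wants to do the same batch conversion, this is roughly the kind of script I’d use. It assumes ffmpeg is installed (e.g. via Homebrew), and the source/destination folders are just placeholders:

```python
# Batch-convert FLAC to ALAC (lossless to lossless) with ffmpeg, keeping tags.
# Assumes ffmpeg is on PATH; SRC and DST are placeholder folders - adjust to taste.
import subprocess
from pathlib import Path

SRC = Path("~/Music/flac").expanduser()   # hypothetical source folder
DST = Path("~/Music/alac").expanduser()   # hypothetical destination folder

for flac in sorted(SRC.rglob("*.flac")):
    out = (DST / flac.relative_to(SRC)).with_suffix(".m4a")
    if out.exists():
        continue                           # skip files already converted
    out.parent.mkdir(parents=True, exist_ok=True)
    # -vn drops embedded cover art (re-add it in Music if you care about it);
    # audio metadata tags are carried over by default.
    subprocess.run(["ffmpeg", "-i", str(flac), "-vn", "-c:a", "alac", str(out)],
                   check=True)
```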
 

Yeah, I just spent 15 minutes staring at three of those monitors at the Apple Store…so tempted. Let us know more about both purchases as you get more insights.

BTW - I forget - which CPU did you get?
 
I got the base M1 Ultra
That gives “base” a new meaning.

Damn, that thing must be so sweet. I used to write software at AMD that we used to design the chips. One thing I wrote was called agincourt. It was what we called a circuit classifier and static checker - it would look at the circuit netlist (the list of transistors and their interconnections) and try to figure out what each circuit was, and whether it met certain rules. There was a commercial tool, but it would take all night to run. I came up with the idea of using tricks from vision recognition, and a partitioning algorithm we filed a patent on, to speed that up. My tool ran in about an hour for the whole chip, and since it could run in parallel (after the partitioning step), we could distribute it on many machines and get it down to 15 minutes. That was on AMD Athlons and SPARC workstations at the time, and the chips we were designing were Opterons, which had only 100 million transistors. With Ultra we could undoubtedly have done the whole thing in real time as we made edits to the chip, even if the chip was as big as Ultra itself.
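For anyone curious what “circuit classification” means in practice, here’s a toy sketch of the idea (nothing like the real tool): partition the transistor netlist into channel-connected groups, then pattern-match each group against known gate templates. The partitioning is also what makes it easy to spread the work across machines.

```python
# Toy circuit classifier: group transistors into channel-connected components,
# then pattern-match each group. A real tool handles thousands of patterns,
# sizing rules, and millions of devices; this just shows the shape of the idea.
from collections import namedtuple

Transistor = namedtuple("Transistor", "name kind gate source drain")  # kind: 'p' or 'n'
VDD, GND = "vdd", "gnd"

def channel_groups(netlist):
    """Union transistors that share a source/drain net (power rails excluded)."""
    parent = {t.name: t.name for t in netlist}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    net_owner = {}
    for t in netlist:
        for net in (t.source, t.drain):
            if net in (VDD, GND):
                continue
            if net in net_owner:
                parent[find(t.name)] = find(net_owner[net])
            else:
                net_owner[net] = t.name
    groups = {}
    for t in netlist:
        groups.setdefault(find(t.name), []).append(t)
    # Each group can be classified independently, so the groups are also the
    # natural unit for farming the work out across many machines.
    return list(groups.values())

def classify(group):
    """Recognize a simple static CMOS gate (only the inverter template here)."""
    pmos = [t for t in group if t.kind == "p"]
    nmos = [t for t in group if t.kind == "n"]
    if len(pmos) == 1 and len(nmos) == 1:
        p, n = pmos[0], nmos[0]
        if p.gate == n.gate and p.source == VDD and n.source == GND and p.drain == n.drain:
            return f"inverter: in={p.gate} out={p.drain}"
    return "unrecognized"

# A single CMOS inverter: in -> out
netlist = [
    Transistor("m1", "p", gate="in", source=VDD, drain="out"),
    Transistor("m2", "n", gate="in", source=GND, drain="out"),
]
for group in channel_groups(netlist):
    print(classify(group))   # -> inverter: in=in out=out
```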
 
I admit that @Cmaier's enthusiasm is infectious. I'm used to him being strictly logical and rational, while occasionally displaying his trademark acerbic humor and wit. Apple must have done something remarkable if it gets a veteran CPU architect's enthusiastic attention.

Over at "the other place", he compared the time we are now living in to the computer wars of the 80s. Back then you could walk into a software store (yes, they actually existed) which had isles for a half-dozen computer systems, each running different operating systems, with substantially different underlying hardware architectures.

Then things got boring. Microsoft dominated with Windows. The classic Mac OS was relegated to 1-2% market share, mainly used for desktop publishing and graphic arts. Linux was nothing more than a curiosity, at best. RISC designs slowly disappeared from desktops and workstations.

As competition dwindled, so did CPU progress. It wasn't too long ago that Intel had stagnated at four cores because AMD wasn't competing, and nobody else challenged them. Apple seemed to have little interest in updating the Mac. I recall TidBITS running an article about how the Mac was quickly becoming a device for older generations, and once those folks essentially died out, the Mac would likely go with them. The Mac would become a legacy product, a side project of the iPhone company.

Apple did the exact opposite, completely revitalizing the Mac with their own custom SoCs, new industrial designs, and a complete overhaul of macOS. AMD is back in the game, forcing Intel to innovate, including getting into graphics cards. New desktop ARM designs are coming from the likes of Qualcomm and Nvidia. Microsoft is trying new and interesting things with Windows.

The traditional desktop computer market hasn't been this exciting in decades, and it's great to see, after experiencing stagnation for so long.
 