Mac Studio

I believe I'm on the verge of suspension. Have received 3 PMs from mods today :LOL:
MR's moderation policy is so dumb. They should've cleaned out obvious bad-faith trolls like mi7chy long ago, but refuse to, apparently out of some insane notion that permitting trolling is how you prevent forums from becoming echo chambers.

Since they coddle trolls, that means there are always little outbreaks of drama. They compound this by instinctively deleting posts. This is a bad way to moderate. For all its faults (and there are many), one of the things the Something Awful forums (for those not familiar, one of the earliest big forums on the internet) got right about moderation is that for the most part, bad posts are left intact. They just have the mod who took action on the post edit in a big USER WAS PROBATED/BANNED FOR THIS POST notice. It's a very simple and effective method of communicating to the community: "see this? This is what we don't want here." Disappearing everything just means nobody really knows what will or won't get them in trouble.
 
Also very not cool to edit someone's posts to remove the parts that relate to conversations they've chosen to delete, as they sometimes do. I hadn't seen that on any other forum, ever, and I find it quite infuriating. Either delete my posts or don't, but ffs don't edit them to remove just certain parts. Who on earth thinks that's a good idea?
 
See, I'm not against trolling per se --- I even find some of the trolling entertaining, and oftentimes find that trolled threads get lots of great, enlightening responses from those smarter than I (many of you have contributed to that).

The issue is that when someone is obviously trolling, you're not allowed to remotely suggest that trolling is taking place, or else you're insulting someone.
Since they coddle trolls, that means there are always little outbreaks of drama. They compound this by instinctively deleting posts. This is a bad way to moderate. For all its faults (and there are many), one of the things the Something Awful forums (for those not familiar, one of the earliest big forums on the internet) got right about moderation is that for the most part, bad posts are left intact. They just have the mod who took action on the post edit in a big USER WAS PROBATED/BANNED FOR THIS POST notice. It's a very simple and effective method of communicating to the community: "see this? This is what we don't want here." Disappearing everything just means nobody really knows what will or won't get them in trouble.
This seems great, but would never fly at TOP. Revenue from high traffic means that moderation has to adapt to the soupe du jour; setting an actual plumb line for what's acceptable doesn't allow the micro-adjustments of moderation that optimize traffic.
 
See, I'm not against trolling per se --- I even find some of the trolling entertaining, and oftentimes find that trolled threads get lots of great, enlightening responses from those smarter than I (many of you have contributed to that).

The issue is that when someone is obviously trolling, you're not allowed to remotely suggest that trolling is taking place, or else you're insulting someone.
I actually agree, but if you're going to be heavy-handed and clamp down on fights like the MR mods do, you have to be actually fair about why the fights happened, and that means banning the trolls. Instead, they let trolls get away with their bullshit as long as they maintain the thinnest veneer of pseudo-civility.

Basically, I'd prefer a trolls allowed + appropriate mocking allowed vibe, with mods stepping in to ban the worst or most boring and repetitive trolls. Failing that, I want no trolls allowed. Instead it's a weird place where the mods often help the trolls out, and even when they don't, you have to walk on eggshells when responding to trolls. It sucks a lot.
 
So how about them Mac Studios?

:p
You've been working to keep things on topic this week, I'll give you that.

It's a great machine and tempting. Last year I got the new M1 MBP and it's waaay faster than my 27" iMac; I'm just not sure I can justify the expense at this point, but I'll take another look next year.
 
Is it outrageous to say that the Mac Studio delivers very good value for money (and performance per watt) for those who can make use of it? You can buy the whole system for not much more than what a Xeon 8160 (which I use at work) cost when new. And it's even more insane that the whole Mac can sit on the desk (or in a drawer), not in some 2U rack unit. I'd be worried if I were Intel (if Intel isn't worried and feeling insecure already).
 

But alder lake or something! ;-)
 
nVidia will kill them. Right?

One thing that confuses me is GPU "cores". The info on the nVidia 30 series says that it has something like 18K+ cores, but the M1 has 7 in the lowest-bin, up to 64 in the top-bin Ultra. What exactly is the linguistic difference here? Is an RTX "core" more like an EU in the M-series? And how does the Neural Engine factor into the difference?
 

You aren’t the only one. While I could very well be wrong, something like nVidia's CUDA core count is more analogous to the count of ALUs in a CPU. nVidia counts individual processing units in a lot of their marketing, rather than the clusters of processing units (SMs). AMD and Apple count the clusters, labeling them compute units (AMD) or cores (Apple). Even then they aren't directly comparable, unless you're talking about different models in the same lineup.

Thankfully, reviewers tend to focus more on throughput.

As for the neural engine, my understanding is that it is analogous to the tensor cores.
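To put rough numbers on the counting conventions described above, here's a sketch in Python. The per-cluster figures are my ballpark assumptions from public architecture overviews (an Ampere SM as 128 FP32 CUDA cores, an M1-family GPU core as 128 ALUs, an RDNA 2 CU as 64 stream processors), not official like-for-like specs:

```python
# Converting marketed "core" counts into roughly comparable ALU counts.
# Per-cluster figures are assumptions from public architecture overviews.
ALUS_PER_CLUSTER = {
    "nvidia_ampere_sm": 128,   # nVidia markets the ALU count ("CUDA cores") directly
    "apple_m1_gpu_core": 128,  # Apple markets the cluster count
    "amd_rdna2_cu": 64,        # AMD markets the cluster count ("compute units")
}

def total_alus(cluster_kind: str, clusters: int) -> int:
    """Turn a marketed cluster count into an ALU count for rough comparison."""
    return ALUS_PER_CLUSTER[cluster_kind] * clusters

# RTX 3090: 82 SMs -> the 10,496 "CUDA cores" nVidia quotes.
print(total_alus("nvidia_ampere_sm", 82))   # 10496
# M1 Ultra: 64 GPU cores -> a comparable figure.
print(total_alus("apple_m1_gpu_core", 64))  # 8192
```

Even then this only makes the marketing numbers comparable in units; as noted above, throughput benchmarks are the sane way to compare across vendors.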
 
Speaking of which... the M1 Ultra now has two fully unused Neural Engines.
Yeah, I'd love to know why they chose to keep the extra Neural Engines dark silicon. Doesn't even appear to be about yield, as people figured out how to tell which of the two neural engines is the live one on the M1 Max die, and (admittedly in an informal twitter survey) everyone seems to have the exact same one.
 
Maybe this is a good theory?
https://www.twitter.com/i/web/status/1477678685935312899/

Maybe there's just no way (yet) to balance the load between the two Neural Engines, or they're working on the API to do it.
 
So I could replace my 2013 Mac Pro with the 48-GPU-core, 20-CPU-core M1 Ultra Mac Studio with 2 TB and 64 GB for $4399. This compares pretty favorably to the price I paid for a 3.5 GHz 6-core Xeon 16 GB/512 GB dual AMD FirePro D500 Mac Pro in 2014. That was $4299, but over the years I've upgraded to 64 GB and a 2 TB SSD (I do not remember what those upgrades cost, but it was certainly more than $100).

I would lose x86-64 VM compatibility but the Mac Studio might be a little faster. (GB5 864/5036 vs. 1793/24055).
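For what it's worth, taking those quoted GB5 scores at face value, the ratios work out to:

```python
# Speedup implied by the Geekbench 5 scores quoted above:
# 2013 Mac Pro 864/5036 vs. M1 Ultra Mac Studio 1793/24055.
old_single, old_multi = 864, 5036
new_single, new_multi = 1793, 24055

print(f"single-core: {new_single / old_single:.1f}x")  # single-core: 2.1x
print(f"multi-core:  {new_multi / old_multi:.1f}x")    # multi-core:  4.8x
```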
 
Pixar’s going to buy a lot of ‘em, I bet.

AFAIK, Pixar uses a Linux-based pipeline...?

One thing that confuses me is GPU "cores". The info on the nVidia 30 series says that it has something like 18K+ cores, but the M1 has 7 in the lowest-bin, up to 64 in the top-bin Ultra. What exactly is the linguistic difference here? Is an RTX "core" more like an EU in the M-series? And how does the Neural Engine factor into the difference?

Multiply the ASi GPU core count by 64 to derive like numbers...
 
Maybe not so much. For the high-end Ultra, that would mean the equivalent of 4K "cores", which is a very long way from the 18K number I was seeing for some nVidia cards. Although, the power usage on those cards alone is in the 300W range, which is probably quite a bit higher than the peak draw of an entire Ultra SoC.
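Running that suggested multiplier against the core counts mentioned in this thread (the 64-ALUs-per-core figure is the earlier poster's rule of thumb, not an official Apple number):

```python
# Applying the suggested 64x rule of thumb to marketed Apple GPU core counts.
ALUS_PER_CORE = 64  # earlier poster's multiplier, not an official figure

for name, cores in [("M1 (lowest bin)", 7), ("M1 Ultra (top bin)", 64)]:
    print(f"{name}: {cores} cores -> {cores * ALUS_PER_CORE} ALU-equivalents")
# M1 (lowest bin): 7 cores -> 448 ALU-equivalents
# M1 Ultra (top bin): 64 cores -> 4096 ALU-equivalents
```

Which matches the ~4K figure above, and is indeed well short of the counts quoted for the high-end nVidia cards.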
 
Yeah, I'd love to know why they chose to keep the extra Neural Engines dark silicon. Doesn't even appear to be about yield, as people figured out how to tell which of the two neural engines is the live one on the M1 Max die, and (admittedly in an informal twitter survey) everyone seems to have the exact same one.
I was wondering if this might be a "we left a secret in the M1 Max/Ultra" reveal for when they introduce the Mac Pro.
 
Maybe not so much. For the high-end Ultra, that would mean the equivalent of 4K "cores", which is a very long way from the 18K number I was seeing for some nVidia cards. Although, the power usage on those cards alone is in the 300W range, which is probably quite a bit higher than the peak draw of an entire Ultra SoC.

And those massive GPU dies are the size of an M1 Max or more, but with no pesky CPU cores, Neural Engine cores, Media Engines, or all that stuff taking room away from more CUDA cores...
 