Oct 18 Apple Event

I would say it’s the opposite. The CPU is faster than any desktop Mac other than a Mac Pro with 16 or more cores; it beats even the 12-core Mac Pros.

The GPU is nice, but not “high end desktop GPU”-nice.

I’ll be using it for Xcode, photo editing, Illustrator, electronics simulations, Office, video transcoding, etc.
What I meant is that upgrading from the M1 Pro to the M1 Max seems to add disproportionately more GPU power than CPU power (from 10 CPU/16 GPU cores to 10/24 or 10/32).

Me too.
I‘ll port my CUDA-based genetic algorithm over to Metal. The thing is heavily limited by memory bandwidth and memory capacity. Curious how it’s going to perform on Apple Silicon/Metal.
I'd love to see the M1s replacing CUDA, but I doubt we're anywhere near. I'd also be curious what performance differential could be expected with that.

To some extent, people need to buy headroom, since the machines aren't upgradeable.
It then comes down to their predictions of the future, and how they approach hardware purchases. The professionals I know lean more towards "working with what they know", rather than "buying new toys", generalising broadly. Thus they would tend to go for something that will do the job, and then some, for the foreseeable future.
But the future has been pretty predictable over the past decade. Except for wattage/battery life and the non-existent GPU, Macs have been adequately powered for office-based work for at least the past five years.

Where I always needed extra power was usually CUDA-based stuff, so if Apple doesn't create a CUDA alternative, I'd still have to stick with a desktop with a GTX card.
 
Ah, yes, definitely the main improvement from Pro to Max is the GPU. Memory bandwidth as well, of course. I believe the Max has double the SLC of the Pro? If so, that will improve CPU performance for some tasks, even if it doesn’t show up in benchmarks (which typically do not focus on that sort of memory access pattern).
 

It's going to go toe to toe with the mobile RTX 3080 from what I hear. While the mobile cards aren't as fast as their desktop counterparts, they're still pretty stout performers.
 

Sure. My point was simply that, as far as the CPU is concerned, the MBP actually *does* beat almost any desktop. On the GPU side, that’s not the case (Apple will take care of that over the next few years’ worth of chip designs :-)
 
True, which is why I said I‘m curious about the result. Thing is: GAs do not work like machine-learning algos; they do not usually access memory contiguously.
CUDA loads global memory in 32-byte (if memory serves me) transactions. When you need just one bit of that chunk, CUDA still loads all 32 bytes.
So non-coalesced memory access is far from optimal; yet GAs are still much faster on the GPU due to massive parallelism (plus generally superior memory throughput).

Therefore, in theory, with that 32-byte minimum transaction absent, memory access on the M1 architecture should be much faster in my app’s particular case.
If my assumptions prove even halfway true, I should see a significant jump in performance (which is, I believe, already pretty good).

Btw, O/T: how did you guys get the "Vaccinated" badge?
 

“Vaccinated” is in the “preferences” part of the profile editing screen.
 
What I meant about buying headroom for future needs was not about computer development (which is predictable) but about uncertainty over where one's professional needs might move. For instance, will a photographer need to move ever more into video? If so, at what quality level, for which customers, how will customer expectations evolve, how… better get 32GB RAM. ;) That kind of thing.
 
I get it, but the differential is easily $2K. Two to three years down the line, that's a brand-new machine with those capabilities for the cost of the differential. It's a major difference for personal computing; it's minute for an enterprise where this is <1% of the budget. Just saying. Most of us with corporate(-like) jobs are stuck with Windows for work. :/
 
I would imagine we will see two more chips for the Mac Pro (and maybe iMac Pro):

iMac Ultra - 20 CPU cores, 64 GPU cores, 64 GB RAM
iMac Ultra Duo - 40 CPU cores, 128 GPU cores, 128 GB RAM

I was pretty close.
 
The most shocking thing about this event isn't the powerful new hardware on display, but how relatively affordable it all is.
 
Very impressive interconnect performance, Studio looks like a great compact desktop workstation, pretty much an unprecedented product.

Love the display, but that price tag… ugh... I mean, I would pay (with some reluctance) around a thousand bucks for a nice display, but what they charge is just insane… not entirely unreasonable given the market, but still insane.
 
This is the mid-range desktop that I've wanted since I got my first Mac mini in 2005. It was frustrating how the only product in that category, on the desktop side, was the iMac. I've always owned minis because I wanted flexibility with the display and didn't want the included mouse and keyboard. Now, I'll be buying my first Mac that isn't a mini, and I won't be forced to have the display integrated and have extra peripherals that I won't use. Also, the pricing for the Max version isn't as unreasonable as I had been concerned about.

Seeing how rumors put it around $2,500, it isn't as bad as it could have been. Given that the LG it replaces was $1,300, the more advanced microphone, camera, and quality speakers make up for some of the difference. Plus, you get Apple's support and industrial design with it. Like the Mac Studio, this is the monitor I have wanted for years now.

Also, I would note that the 27-inch iMac is no more, leaving just the Mac Pro to transition. In all likelihood, the iMac Pro wasn't ready for release, so this is the current solution. Rumors put the iMac Pro and Mac Pro as 2023 products, so it makes sense. I also noticed that Apple still has the 6-core Intel Mac mini for sale on their online store, which I'm assuming is for those who still need x86 compatibility for legacy reasons.
 
The crappy LG 27" 5K UltraFine is $1,300, so it isn't surprising that the Apple version is $1,600.
 