M2 Pro and M2 Max

exoticspice1
Site Champ · Posts: 309 · Reaction score: 114
They will fare better, obviously, but still at a huge power-consumption cost. The leaked TDPs of the mobile 40 series are hardly inspiring.
What cost? The 40 series is more efficient than the previous gen: it uses 100 watts less power while being more powerful in raster and RT, and it runs at lower temps. If Nvidia uses the same 5nm node and the same Ada architecture, I expect efficiency improvements for the laptop cards as well.

Not everyone is Intel, pumping in more and more watts every gen to get speed bumps.
 

Jimmyjames
Site Champ · Posts: 767 · Reaction score: 870
Interesting article on the P cores and E cores in the M2 Max.

[attached image]


 

theorist9
Site Champ · Posts: 637 · Reaction score: 596
3DMark Wild Life Extreme test. No idea how it compares with Nvidia and AMD.
What's the link for that? I can't tell from the graphic whether that's the 30-core or the 38-core M2 Max. [It's always good to post links anyway, to credit whoever created the graphic.]
If it's the 30-core, the scaling from the 10-core M2 is 98%; if it's the 38-core, it's 77%.

Edit: I emailed the author, and he said it's the 38-core. So 77% scaling relative to the M2. He also said that they didn't use high-power mode when testing it.
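For anyone who wants to check the math: those percentages are per-core scaling efficiency, i.e., the actual speedup divided by the ideal linear speedup from the extra cores. A minimal sketch, using hypothetical scores picked only to reproduce the ~77% figure:

```python
# Per-core GPU scaling efficiency: actual speedup divided by the ideal
# linear speedup you'd get if performance grew with core count.
def scaling_efficiency(score_big, score_small, cores_big, cores_small):
    ideal_speedup = cores_big / cores_small
    actual_speedup = score_big / score_small
    return actual_speedup / ideal_speedup

# Hypothetical Wild Life Extreme scores (placeholders, not real results),
# chosen so the math matches the quoted M2 (10-core) vs. M2 Max (38-core).
m2_score, m2_max_score = 6_800, 19_900
print(f"{scaling_efficiency(m2_max_score, m2_score, 38, 10):.0%}")  # -> 77%
```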
 
Last edited:

leman
Site Champ · Posts: 701 · Reaction score: 1,309
Yeah, that would be VERY hard. Unlike AMD and Intel, Nvidia is competent and uses the latest nodes from TSMC now. Plus CUDA and OptiX are great too. Their RT is also industry leading.

My take is Apple's RT will be on par with the 30 series, or at least I hope it will be.

Yeah, Nvidia has been working on it for a while, and Apple is a newcomer. But judging by published patents, their approach is different enough to be promising. And they have some of the advanced features of the 40 series, like ray compacting, etc.

What cost? The 40 series is more efficient than the previous gen: it uses 100 watts less power while being more powerful in raster and RT, and it runs at lower temps. If Nvidia uses the same 5nm node and the same Ada architecture, I expect efficiency improvements for the laptop cards as well.

What you write here is different from the leaked product specs. But I’d like to wait and see.

The 3DMark tests on PC are different, incorporate RT in some cases, and are also more demanding, but at least with Wild Life Extreme we can compare with past Apple GPUs.

Wild Life is specifically designed to be a cross-platform test; it only uses features available on all platforms (at least according to the benchmark description).

BTW, that result is comparable with a 3070 (laptop). Not too bad.
 

exoticspice1
Site Champ · Posts: 309 · Reaction score: 114
What you write here is different from the leaked product specs. But I’d like to wait and see.
I was referring to their RTX 40 desktop lineup. We can already see the benefits of Ada and TSMC 5nm on desktop. No reason that wouldn't apply to the laptop lineup as well, since it's based on Ada and 5nm too.
 

exoticspice1
Site Champ · Posts: 309 · Reaction score: 114
Wild Life is specifically designed to be a cross-platform test; it only uses features available on all platforms (at least according to the benchmark description).
Ahh, my bad. I was thinking of the Steam 3DMark, which is made entirely for PCs.


The Wild Life result that I linked shows that you're right. Does anyone have an RTX 4080 to compare with? ;)
 

exoticspice1
Site Champ · Posts: 309 · Reaction score: 114
Interesting, the Steam version also has Wild Life Extreme along with a bunch of other tests. I must say 3DMark sounds infinitely more fun than Geekbench when it comes to benchmarking GPUs.

 

leman
Site Champ · Posts: 701 · Reaction score: 1,309
The Wild Life result that I linked shows that you're right. Does anyone have an RTX 4080 to compare with? ;)

Probably north of 70k points, judging by the Ultra score. An entirely different beast to be sure. But also almost an order of magnitude higher power consumption.
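Back-of-the-envelope on the power gap, with the caveat that every number here is an assumed ballpark (roughly 320 W board power for a desktop 4080, roughly 40 W for the M2 Max GPU under load), not a measurement:

```python
# Rough perf-per-watt comparison; all scores and wattages are assumed
# ballpark figures for illustration, not benchmark measurements.
gpus = {
    "RTX 4080 (desktop)": {"score": 70_000, "watts": 320},
    "M2 Max 38-core": {"score": 19_900, "watts": 40},
}
for name, g in gpus.items():
    print(f"{name}: {g['score'] / g['watts']:.0f} points/W")
# 320 W vs. 40 W is 8x the draw -- close to an order of magnitude.
```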
 

exoticspice1
Site Champ · Posts: 309 · Reaction score: 114
Seems a bit low given that the 4090 is at 80k+, but who knows.
I think that's about right, because the 4090 has ~16,000 CUDA cores while the 4080 has around 9,500.

EDIT: looks like the 4070 Ti is around 45K. If the M2 Ultra scales well in Wild Life it should be around 40K. Kinda sad that Apple can't even match the 4070 Ti unless they increase clocks.

If the M2 Ultra is as high as it goes in the Mac Pro, then GPU-bound tasks, excluding the ones that benefit from the large unified memory, will be slower than an RTX 4070 Ti.

That's not even mentioning the price. The 4070 Ti is $800–$850, while Apple charges $1,000 just to go from 48 to 64 GPU cores on the M1 Ultra. Very greedy.
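For what it's worth, here's the back-of-the-envelope behind the ~40K guess; the M2 Max score and the scaling factors are assumptions, not benchmark results:

```python
# Projecting an M2 Ultra Wild Life Extreme score from an assumed M2 Max
# result. The Ultra is two Max dies fused together, so the ideal is 2x;
# real-world scaling is usually lower. All numbers are assumptions.
m2_max_score = 19_900
for scaling in (1.0, 0.9, 0.8):
    print(f"scaling {scaling:.0%}: ~{2 * scaling * m2_max_score:,.0f} points")
# Even at a perfect 2x (~40K), that's below the 4070 Ti's reported ~45K.
```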
 
Last edited:

Jimmyjames
Site Champ · Posts: 767 · Reaction score: 870
I think that's about right, because the 4090 has ~16,000 CUDA cores while the 4080 has around 9,500.

EDIT: looks like the 4070 Ti is around 45K. If the M2 Ultra scales well in Wild Life it should be around 40K. Kinda sad that Apple can't even match the 4070 Ti unless they increase clocks.

If the M2 Ultra is as high as it goes in the Mac Pro, then GPU-bound tasks, excluding the ones that benefit from the large unified memory, will be slower than an RTX 4070 Ti.

That's not even mentioning the price. The 4070 Ti is $800–$850, while Apple charges $1,000 just to go from 48 to 64 GPU cores on the M1 Ultra. Very greedy.
The M1 Ultra already matches the 4070 in GFXBench 4K Aztec. The M2 Ultra should beat it easily.
 