Alder Lake

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,209
Reaction score
8,250

Those power numbers are, as predicted, insane: 125 W base, 241 W turbo, 272 W max. For just a CPU.

Single thread P-core uses 55-65 W. By itself. 11-15 W for the E-core.

Weirdly, the P-cores support AVX-512 and the E-cores do not, but instead of just ensuring that all threads that need it are routed to the P-cores, they just disabled the AVX-512 hardware in the BIOS.

Thread Director makes naive assumptions: any workload not in the user's focus gets deprioritized. Dumb.
 

SuperMatt

Site Master
Posts
7,862
Reaction score
15,004
Looks like a nice place to visit if you’re ever near Tacoma:

[image]


 

Pumbaa

Verified Warthog
Posts
2,564
Reaction score
4,220
Location
Kingdom of Sweden
Looks like a nice place to visit if you’re ever near Tacoma:

[image]


Yes. Water cooling makes sense given the power numbers mentioned by @Cmaier.
 

leman

Site Champ
Posts
609
Reaction score
1,121
Ugh, yeah, I’m rather underwhelmed. Frankly, I don’t understand why the reviewers are all singing its praises. So Intel managed to build a CPU core that’s slower than ARM cores from 2018 yet still manages to consume more power. What’s the big deal?
 

SuperMatt

Site Master
Posts
7,862
Reaction score
15,004
Not sure, but I do see reviewers being awfully kind to Intel on multiple sites. Minor wins in some benchmarks against current (soon-to-be-previous) gen AMD get billed as a crushing defeat. Weird choice of words from a site that I had considered neutral.

I think the big deal may be that Intel has actually managed to keep up and slightly beat AMD CPUs for the first time in quite a while, hence the big hype and the big news. It does seem though, judging by the consumption, that their little cores are not little enough.
People were talking in the 1990s about how they should consider moving on from x86… is it surprising they are being lapped by competitors in 2021? It feels like, instead of pursuing something new, they just keep trying to get that last drop of juice out of the 286 mark 10000.
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,209
Reaction score
8,250

They did. Well, they tried, anyway. They had an amazing new 64-bit architecture that was going to set the world on fire. Unfortunately, it set itself on fire instead, producing mediocre results at too-high TDPs.

I prefer to think I had a personal hand in that, by helping to design the 64-bit architecture that actually did set the world on fire :)
 

Renzatic

Egg Nog King of the Eastern Seaboard
Posts
3,895
Reaction score
6,816
Location
Dinosaurs
They did. Well, they tried, anyway. They had an amazing new 64-bit architecture that was going to set the world on fire. Unfortunately, it set itself on fire instead, producing mediocre results at too-high TDPs.

Talking about Itanium? I vaguely remember people talking about it being the next big thing, then it disappeared with nary a mention since.

As far as Alder Lake goes, this more than anything shows the weakness in Intel's designs. Yeah, it's a fast chip, but it has to consume a ridiculous amount of power to push itself ahead of the competition. To use an analogy, it's like Intel took a decent enough car, tweaked the aerodynamics a bit, then welded a couple of jet engines on the side, all so they could say they have the fastest car in the world.

...yeah, it's fast, but it's not exactly long term feasible. Meanwhile, the competition is making cars that are smaller, considerably faster, and don't require high octane jet fuel to run.
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,209
Reaction score
8,250
It did ultimately earn that nickname. It looked like they structured it to optimize its ability to emulate x86 as efficiently as possible but came up way short even in that.

The original Itanium actually had an entire x86 core sitting on the die. A really bad one.
 

Yoused

up
Posts
5,508
Reaction score
8,679
Location
knee deep in the road apples of the 4 horsemen
Jim Salter over at Ars seems to think the Alder Lake design is the bee's knees, "crushing" Ryzen in benchmarks, and that one of its big drawbacks is "Does not come with free kitten." I mean, kittens are cute and fluffy, but then you have to feed them and take care of them and, eventually, try to cope with the cattitude they grow into.
 

Andropov

Site Champ
Posts
602
Reaction score
754
Location
Spain
The Thread Director using focus to put apps on the performance cores is honestly the dumbest thing I've read about this launch. I get that Intel doesn't have the control over the OS needed to implement something like GCD and QoS for processes as Apple does, but their solution is just... dumb. Maybe they rushed this launch to try to save face against Apple Silicon and that's why some things are half-baked (AVX-512 support?).

I wonder what Intel has left for next year. Alder Lake already brings a new process node, a new DDR version, a new core architecture, and heterogeneous CPU cores. Since Intel used to update the process node and core μarch every two years... what's left for 2022?
 

Yoused

up
Posts
5,508
Reaction score
8,679
Location
knee deep in the road apples of the 4 horsemen
Someone at Ars posted a video link that showed an Alder Lake CPU getting a better (something) performance score at 35 W than an M1 Max at 30 W. Of course, if you adjust for the P/W value, the Max did get a better result per watt, whatever that means – and it looks like the 35 W figure was not a package number, which appeared to be more like 44 W.

The i9 appeared to be handicapped by having two cores disabled, so it was beating the Max with only 14 cores running. Well, sorta beating it, but not really. 14 cores trouncing 10 cores with a worse net level of performance. Yeah, maybe not so much.

And, of course, the video is entirely in Mandarin, so, good luck with that.
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,209
Reaction score
8,250
Someone at Ars posted a video link that showed an Alder Lake CPU getting a better (something) performance score at 35 W than an M1 Max at 30 W. Of course, if you adjust for the P/W value, the Max did get a better result per watt, whatever that means – and it looks like the 35 W figure was not a package number, which appeared to be more like 44 W.

The i9 appeared to be handicapped by having two cores disabled, so it was beating the Max with only 14 cores running. Well, sorta beating it, but not really. 14 cores trouncing 10 cores with a worse net level of performance. Yeah, maybe not so much.

And, of course, the video is entirely in Mandarin, so, good luck with that.
Ah, this is the “emulated result” thing I was just asked about, I guess?

Let’s play this game in reverse. Add more cores to M1 Max and increase the package power to match, and emulate the performance :)
 

leman

Site Champ
Posts
609
Reaction score
1,121
Someone at Ars posted a video link that showed an Alder Lake CPU getting a better (something) performance score at 35 W than an M1 Max at 30 W. Of course, if you adjust for the P/W value, the Max did get a better result per watt, whatever that means – and it looks like the 35 W figure was not a package number, which appeared to be more like 44 W.

The i9 appeared to be handicapped by having two cores disabled, so it was beating the Max with only 14 cores running. Well, sorta beating it, but not really. 14 cores trouncing 10 cores with a worse net level of performance. Yeah, maybe not so much.

And, of course, the video is entirely in Mandarin, so, good luck with that.

The thing is, I don't doubt this. M1 doesn't perform too well in Cinebench, and this has been discussed in detail before. In fact, if the benchmark is accurate, it means the mobile ADL (in a comparable config) is likely to be at least 30–40% slower than the M1 Pro/Max in demanding sustained CPU workloads. The full 200 W+ 8+8 i9 ADL beats the M1 Pro/Max in Cinebench multicore by a factor of almost 2.5x, and yet in many SPEC tests they show similar performance.
 