Charlie is full of it, lol
Those Samsung models are shipping early and limited to 2.5GHz; I suspect it's a firmware cap that'll be lifted. It also runs contrary to Samsung's own advertising about the clocks, so it's not a "the core is actually fake BS" thing re: IPC, it's a firmware issue. The review from the same guy on Reddit said responsiveness was still top-notch and either blew everything else (including his MTL PC) out of the water or matched his Mac.
If I had a dollar for every "QUALCOMM LYING" accusation from very motivated anti-fans that turned out to be premature, wrong, or a half-truth, I'd be a very wealthy guy by now. People really, really want this thing to fail. I have bad news for them!
			Ofc if there’s something else going on and the chips really are capped to 2.5GHz, that’d be terrible, but he said it was capped in both power-efficiency and plugged-in modes, which strongly makes me suspect it’s just a Samsung firmware thing.
Charlie exaggerates for clout or tells half-truths, which is what I expect is going on here, even if he knew this was a weird issue.
Now again, if it’s actually true that they’re literally lying and selling 2.5GHz chips as 4GHz X1E-80s, that’s terrible and he’d be right, but I doubt it.
Don't get me wrong, I don't think this is a silicon issue; the CPU design looks intrinsically good, incredibly similar to M1/2 in many important respects (as one would expect), and still good where it isn't similar to that design. But these results are bizarre: not even all the Qualcomm chips in the Samsungs are behaving this way (some Galaxy Books with the same apparent chip types as the offending ones post the expected 2700-2900 scores), and it isn't limited to Samsung either; here's Asus and, to a lesser extent, Lenovo. Now, for any GB score I can find a few outliers for any chipmaker where someone clearly did something wrong, be it the user running stuff in the background or a lemon that somehow escaped validation testing, but I've never seen this. I can agree that, from the little I've seen, Charlie is prone to flares of hyperbolic cynicism, but in fairness he also said the silicon itself wasn't the issue. Though if memory serves, even while accusing Qualcomm of lying he was blaming MS for the performance problems through unspecified means, and I'm not sure that tracks with what we're seeing; maybe sort of? If this is a firmware issue it's a pretty bad one, and very oddly stochastic.
			Yeah, everyone knows Qualcomm’s compute drivers suck, and I think the architecture itself is also largely targeted towards graphics.
To me it's more of an intriguing puzzle: what is it about the architecture that suffers under compute loads? I really liked Chips and Cheese's hypothesis that the GMEM caches are geared towards graphics with little left over for anything else, but now Anandtech's article seems to dispute it, which annoys me greatly.

Also, I have to admit the M1/2 GPU cache situation wasn't great either, if I remember Chips and Cheese's article on that correctly, but they obviously don't suffer as badly in compute, which maybe gives more weight to driver issues? That said, Chips and Cheese's cache hypothesis is based on their own testing, while Anandtech is breaking down engineering slides from Qualcomm, and I can't quite remember how the M1/2 caches compared, so I may be wrong about that.
			I’m fine with this, because for now games and DirectML are all it’d be used for, and performance with those is fine, at least by comparison. And on phones compute performance is really nbd.
That said, down the line Adreno will need to evolve into a better compute architecture, and they’ll have to improve the drivers. Even on basic ultrabook laptops that would be a good idea.
Absolutely.
			Also didn’t realize you guys knew who Longhorn was, that’s cool.
Yup!
			IMO Apple’s iGPUs, then maybe Arm’s, then Intel’s new ones are, in that order, the best iGPUs from a holistic perspective of graphics/compute capability.
Nvidia’s we haven’t seen yet, or rather not in a while, but I suspect theirs will land near the top.
I'm hoping for a MediaTek-Nvidia analog to the M3 Max; even better would be a desktop form factor, so I could also use a dGPU for development. Basically I'd focus on the integrated GPU for development, but having a dGPU for testing would be nice too. That's probably hoping for a bit much, and it'll probably be laptops only, but even with a laptop I could maybe run a discrete GPU as an eGPU if it's just for testing and development purposes. My ideal would still be a medium-sized desktop.