I admit that
@Cmaier's enthusiasm is infectious. I'm used to him being strictly logical and rational, while occasionally displaying his trademark acerbic humor and wit. Apple must have done something remarkable if it gets a veteran CPU architect's enthusiastic attention.
Over at "the other place", he compared the time we are now living in to the computer wars of the 80s. Back then you could walk into a software store (yes, they actually existed) that had aisles for a half-dozen computer systems, each running a different operating system on substantially different underlying hardware architectures.
Then things got boring. Microsoft dominated with Windows. The classic Mac OS was relegated to 1-2% market share, used mainly for desktop publishing and graphic arts. Linux was nothing more than a curiosity, at best. RISC designs slowly disappeared from desktops and workstations.
As competition faded, CPUs stagnated with it. It wasn't long ago that Intel sat at four cores for years because AMD wasn't competing and nobody else challenged them. Apple seemed to have little interest in updating the Mac. I recall
TidBITS running an article about how the Mac was quickly becoming a device for older generations, and once those folks died out, the Mac would likely go with them: a legacy product, a side project of the iPhone company.
Apple did the exact opposite, completely revitalizing the Mac with its own custom SoCs, new industrial designs, and a thorough overhaul of macOS. AMD is back in the game, forcing Intel to innovate, including its move into graphics cards. New desktop ARM designs are coming from the likes of Qualcomm and Nvidia. Microsoft is trying new and interesting things with Windows.
The traditional desktop computer market hasn't been this exciting in decades, and after so many years of stagnation, it's great to see.