Intel reports largest quarterly loss in company history.

My sense is that years ago Cook realized that with Apple silicon on the horizon for use in Macs, there would be a time in the future where Mac revenue would decrease as customers would be holding onto Macs for longer periods of time, due to their excellent performance being good enough for most people.
Apple Silicon has not really changed that, though. My best friend bought her iMac in 2017 and has no plans to upgrade until it starts throwing sparks or something. My Cube lasted almost a full decade. Macs are expensive, so owners want to get their money's worth. Fortunately, they also tend to be quite durable, so getting your money's worth is quite possible.
 
Apple Silicon has not really changed that, though. My best friend bought her iMac in 2017 and has no plans to upgrade until it starts throwing sparks or something. My Cube lasted almost a full decade. Macs are expensive, so owners want to get their money's worth. Fortunately, they also tend to be quite durable, so getting your money's worth is quite possible.

From Mac registration data and surveys, I think Apple has pretty good insight into who's purchasing (or not purchasing) new Macs, from both repeat customers as well as first-time purchasers.

And with that and other information (economy, employment stats, etc.), they can suss out why Mac sales are significantly down this quarter, and how (or if) AS plays a part.

The good news is Services now accounts for 22% of Apple's total revenue and helps cover the gap.
 
Well, I guess it's because the only truly compelling product they really have (outside of niche users who need CPUs fast at AVX-512 or whatever special instruction Intel has this week) is the Arc GPU (no, seriously, bear with me).
I've said previously that Intel's Arc GPUs are the most interesting consumer product that Intel has produced in years. Their CPU releases have been moribund and uninspired; at least this is something different. The CPU division won't become interesting until Arrow Lake, later in 2024, and perhaps won't be truly substantive until Lunar Lake in 2025 or 2026.

I hope that Intel continues to work on Arc, because the PC market could use a third GPU manufacturer, with Nvidia abusing its customers and AMD continually shooting itself in the foot. There's been a recent kerfuffle about Nvidia not including 16GB+ cards below their high-end RTX 4080, which is a legitimate concern, because many games already released utilize more than 12GB of VRAM, as has been extensively covered in the tech press. The 4080 currently starts at $1,150 USD on Newegg, which is out of almost every gamer's budget.

In response, AMD's Senior Director of Gaming Marketing, Sasa Marinkovic, bragged about the availability of AMD graphics cards with 16GB of VRAM starting at $499 for an RX 6800, a previous generation card that launched back in late 2020.

AMD_16GB.jpg


While hardly a speed demon, an Intel Arc A770 with 16GB of VRAM can be had for $330 at Newegg, and Intel has made considerable progress with its drivers since Alchemist originally launched. The A770 could be a value graphics card, relatively speaking, with high-end 8GB products like the previous generation RTX 3070 choking on some of the latest games, as Hardware Unboxed has extensively covered. I think it would be pure insanity to buy anything less than a 16GB card in this market, unless a user absolutely needs Nvidia features. I'm surprised that Intel hasn't been hammering on this repeatedly, because it's a rare marketing win that team blue could use against its competitors.

Plus, despite claims from the likes of MLID, Intel continues to invest in Arc and has a roadmap for the next four years. It's anyone's guess whether they will stick with it; their track record isn't great, but the driver improvements have been steady.

You simply need to follow his Twitter. I did for a couple of days expecting something interesting, but Bible verses aren't what I follow tech CEOs for.
I realize that I may have inadvertently initiated a sidetrack about Gelsinger's apparent religiosity. This being the Technology subforum, political and religious discussions should be had elsewhere. Thankfully, the discussion has remained primarily on the tech aspects of Intel, but it was in danger of veering off course. While this wasn't intentional on my part, and we generally don't rule with a heavy hand here at TechBoards, I regret my misstep and will strive to do better in the future. In this instance, nothing came of it, but I shall do my best to prevent any deviations on my part, and to be less lackadaisical as a moderator. I apologize to everyone for nearly taking us off into the hinterlands.
 
My sense is that years ago Cook realized that with Apple silicon on the horizon for use in Macs, there would be a time in the future where Mac revenue would decrease as customers would be holding onto Macs for longer periods of time, due to their excellent performance being good enough for most people.
Macs definitely retain their value much longer than PCs. My 2011 Mac mini held on for seven years, until the 2018 model was released, much longer than I had originally expected. As I have said elsewhere, my base model 2018 Mac mini was originally supposed to be a two-year "stopgap" until Apple Silicon was released, and I'm now on year five. As everyone and their pet muskrat knows, I've been rather troubled about needing a new computer in the not too distant future. I'd typically be enjoying the rumormongering and resultant banter alongside everyone else here, but this time the run-up to WWDC has been more stressful than usual.

To compensate for that projected loss of revenue, Cook went all in on services, to the point where today it represents 22% of Apple revenue, and nicely fills in (and then some) for the Mac revenue downturn.
I suspect that, with considerable investment in new television and movie content, the margin on the Services division isn't as high as in the more mature product categories. I see it as a long-term investment, which will be used to goose sales for Apple's other divisions, much as Amazon does with Prime. With Warner Bros. Discovery renaming HBO Max to simply "Max", I think that provides a gigantic opening for Apple to become the natural provider for premium content, compared to the shovelware found on Netflix, Hulu, and now, apparently, Max. The HBO brand had decades of quality associated with it; WBD is throwing that away, which gives Apple an excellent opportunity to fill that void. I don't currently subscribe to Apple TV+ (or any other streaming service), but I would consider it if Apple becomes the premier content provider it appears to want to be.

As a side note, I'm going to address @throAU's post from another thread here, since it is related to this one, and I didn't want parallel discussions going at the same time. This isn't to diminish what was written; for anyone who hasn't already, I suggest reading @throAU's thoughtful and insightful post, and replying as you see fit. I appreciate hearing from folks who see things from a different angle.

Additionally, Apple had been burned and had to go through CPU transitions multiple times in the past. If some new CPU manufacturer comes out with a revolutionary product that's on the open market, you can guarantee Apple will be running their product stack on it internally until they can either replicate its performance or beat it - or buy it.

I appreciate the historical perspective and Apple's initial timeline for Apple Silicon. For those who have followed my Mac vs. PC saga, I believe I may have poorly communicated my expectations of Apple. One thing that I failed to emphasize was how difficult this transition is, not only because Apple is moving to an all-new architecture with Arm, but because it is doing so on the latest nodes from TSMC. That's not an easy task, and I think that Johny Srouji and his team inside Apple's skunkworks have done a remarkable job, considering the challenges they faced, both from global events and from the gargantuan engineering task they had to undertake. Also, massive credit to the folks working on Rosetta 2, which is so seamless in integrating x86 program support that it's easy to forget how much work it took to accomplish nearly universal compatibility. Compare that to Microsoft's efforts with Windows 11 and Windows-on-Arm, which haven't been nearly as effective. So, Apple has succeeded not only with the hardware aspects of the transition, but with the software as well.

Going back to my personal perspective, I think that Apple has done an amazing job with CPU performance: the P-cores, E-cores, memory bandwidth, component integration, and various accelerators all arrived much sooner than anything from the PC suppliers. As a result of Apple's vertical integration strategy, implementation was also swifter than what could be expected from Microsoft and its partners.

The one aspect where I have been personally skeptical has always been the GPU. I remember discussing this with @leman over a year ago at the MacRumors forum, and our mutual concern that Apple would face difficulties being competitive with the midrange-or-higher graphics performance of standard PC desktops. I don't think anyone expects insane RTX 4090 levels of performance, but the recently released 4070 is highly efficient and uses a tiny PCB. Ignoring Nvidia's obscene pricing, I think this is what Apple should be targeting for most of its desktop line. Teardown courtesy of Gamers Nexus.

4070Teardown.jpg



That's certainly the case for me with my M1 Studio and my modest needs. For me, it's good enough - and it probably will be for the next few years.

While my bellyaching over this may seem excessive, it comes from a position of wanting Apple to succeed. Most of my concerns have centered around the delays of the Apple Silicon Mac Pro, but that product is symbolic of my plight in general. I will almost certainly not be purchasing the next Mac Pro, but will be looking at the mid-range desktop products that will follow it. The Mac Pro will inform us of Apple's plans for high-performance desktops going forward.

I am among the group that believes that Apple will not use AMD GPUs for graphics acceleration in the Mac Pro. I won't go into details on that again, but if nothing else, it goes against Apple's cultural norms. At the time that Apple Silicon was being planned, Nvidia had just screwed them over, and AMD had released the Vega series, which is hardly a standout generation. Those are just a couple of reasons why I believe Apple will stick with its own GPU architecture. When I asked @Cmaier about it last year, he stated that he believed that Apple "likes their architecture", and that there is a good chance that Apple will produce a separate GPU die for a later M-series generation. (This was before he apparently received insider info from one or more old buddies inside of Apple, so this wasn't a leak.)

If Apple does implement third-party graphics cards with the Apple Silicon Mac Pro, that would likely mean that eGPU support would come to other models as well. (This is not to be confused with GPGPU compute; I mean specifically accelerated display graphics.) I'm using an eGPU right now, the Blackmagic RX 580, which was certified and co-designed by Apple. An eGPU is and always will be a kludge; mine has been a pain to deal with ever since I purchased it, so it fits in my "never again" category. I don't think Apple will go this route, and will instead focus on improving its own GPUs. If Apple did go crawling back to AMD for GPU chips, that would signal to me that Apple has no confidence in its own GPU offerings, and therefore PC would win my business by default. I would very much not like that to happen, but this is one of the few areas where I don't see any wiggle room.

My two main concerns have been:

1. Can Apple provide enough GPU horsepower to take on upper mid-range PCs? Everything else about Apple Silicon is impressive, but they have been lacking in this metric, thus far. I do have to keep reminding myself that Apple Silicon is only 1.5 generations in, with the M2 series not even complete.

2. Can the Asahi Linux team successfully get Proton games running on Apple Silicon? I have always said that I don't need access to all Windows PC games, just enough to keep me satisfied. My favorite genre is turn-based isometric RPGs, and by some bizarre happenstance, essentially all of them have a Mac version, with Baldur's Gate 3 being the most anticipated "AAA" title in that sphere. I cannot think of a single isometric RPG that I want to play that doesn't support the Mac natively. For that, I am fortunate, and it covers about 80% of the games I play. If that weren't the case, then I'd have to go PC by default, but it isn't an issue. It's the other 20%, such as the games developed by Remedy, which are the sticking point. I'm hoping that the Asahi Linux team continues to make progress in this regard; they've already succeeded far beyond what I had expected.

If those two things come together, namely a potent homegrown Apple GPU along with Asahi Linux Proton games support, then I think I will have my answer. I was quite pessimistic last month, but after talking it through with my friends here, I have become more patient about letting the final acts play out.
 
By the way, I have one final note for those of you who don't want to read through my considerable essays on this subject. It's one of human psychology, not technology. After hearing the accounts from posters here about Pat Gelsinger's extracurricular activities in promoting his strong beliefs in ceiling cat, I wanted to share my own little tale. No, it has nothing to do with spreading "the good news", that's up to Gelsinger on his own.

CatGod.jpg


In my instance, it's purely about Obsessive-Compulsive Disorder. I have a minor case of it, always have, likely always will. When I was a kid, I had no idea what OCD was, but I had compulsions to collect "one of everything", whether that be trading cards, action figures, or pet rocks. My grandmother was always amazed at how I could sort my toy cars back into the exact positions I'd had them in previously, despite not having played with them for months. In fact, I'd leave them set up to the point where cobwebs would develop, simply because I didn't want to mess with their "perfect positioning".

This was nothing detrimental, just annoying, and it carried on into adulthood. Instead of dealing with obsessions and compulsions in Real Life™, I transferred them over to my computer gear. I have my operating system, software, and services set up "just right" in a way that fits with my OCD. My computer desk, equipment, and upkeep of that area are also symmetrical, clean, and always in good working condition. I'm also a stickler about security, passwords, and updates.

In other words, I took all of that OCD and put it to some good use. It's made me more efficient with my workflow, I know where everything is, and non-tech family and friends will ask to use my Mac for making purchases and other activities that require a secure environment, because they know it is safe.

That's where I'm getting into trouble with the Mac vs. PC debate. The PC option rubs against my OCD in all the wrong places. Microsoft Windows is a chaotic disaster. PC components are a hot mess: two dozen adult Legos housed inside an ugly see-through case, held together with sticks, bubble gum, and ceiling cat prayers. Windows is a pile of legacy code, with visible elements dating back to Windows 3.1, still fully functional, unsightly as they are. x86 chips are inefficient, with yet more legacy crap lurking under the hood making them more inefficient still. This doesn't even touch on all of the shady practices of Microsoft, Intel, AMD, and Nvidia. Apple isn't perfect, but I don't feel like the company actively hates me.

That's certainly the case for me with my M1 Studio and my modest needs. For me, it's good enough - and it probably will be for the next few years.

Which brings me back to my solution, or at least what I hope is my solution. I want Johny Srouji and his team to release desktops with potent GPU cores: custom Apple Silicon, not recycled mediocrity from AMD. Something around 4070 non-Ti performance. Lay on the GPU cores, nice and thick. Give me all the cores! With that, I can use Asahi Linux to hopefully play some Windows-only games, and not just titles from a decade ago that run on CrossOver or Parallels. I like Linux and appreciate it for what it is: a cousin of macOS. It doesn't make my skin crawl like Windows does, so I'd be happy to install it alongside macOS Alcatraz without upsetting my OCD.

Also, I've mentioned before that my Mac is both a tool and a hobby. If I were to switch to PC, then the hobby aspect, my favorite hobby, would go away, leaving me with nothing more than an unsightly tool. Part of my Mac hobby is visiting this forum and chatting with my friends here. TechBoards is the most delightful place on the internet, a refuge from the daily grind, unlike the wretched hive of scum and villainy that is the MR forums. My Mac is the only Apple product I own, and if I give it up, my presence here will become obsolete. I very much don't want that.

Finally, I'm going to leave this long-winded series of posts with this thought. On a certain level, the Mac tickles my OCD in a way that nothing else in the technology industry can. Apple's vertical integration strategy is part of that.

There is a certain elegance to running the last consumer UNIX operating system on top of the only mainstream RISC processor. I don't know if that makes any sense, but I think it has an intangible appeal to it. I wonder if I'm the only person who sees the Mac this way?
 
Getting back to the latest earnings report, I think we could use some context for those "significantly lower" Mac revenue numbers.

The Mac division still makes over 30% more revenue than it did pre-pandemic, and it takes in more revenue than Intel's entire Client Computing Group, which includes PC desktops, laptops, chipsets, and GPUs.

Mac Revenue ($) Year-Over-Year for the current quarter:

2017 - $4.2 billion
2018 - $4.1 billion
2019 - $5.5 billion
2020 - $5.4 billion
2021 - $9.1 billion
2022 - $10.4 billion
2023 - $7.2 billion

Here are Intel's revenue numbers for their Client division, again Year-Over-Year for the most recent quarter:

2017 - $7.9 billion
2018 - $9.0 billion
2019 - $8.6 billion
2020 - $9.8 billion
2021 - $10.6 billion
2022 - $9.3 billion
2023 - $5.8 billion

We can also compare AMD's Client revenue over the same period:

2017 - $0.6 billion
2018 - $1.1 billion
2019 - $0.8 billion
2020 - $1.4 billion
2021 - $2.1 billion
2022 - $2.8 billion
2023 - $0.7 billion

Now, let's look at the gross margin for each company from last quarter:

Intel - 34.1%
AMD - 44.0%
Apple - 44.3%

Keep in mind that these are the percentages for all product divisions, not just Client. While we don't have those exact numbers, analysts have long estimated that profit margins on the Mac are substantially higher than for PC manufacturers. I've seen estimates that while Apple comes in fourth place in the global personal computer market, it may take as much as two-thirds of the profit, though those figures are from analysts, not the companies themselves. Obviously, Apple sells an entire computer, while Intel and AMD are selling the building blocks.

That being said, Apple's Mac division now produces more revenue than Intel's entire Client Computing Group and AMD's Client division combined: $7.2 billion versus $5.8 billion + $0.7 billion = $6.5 billion. Given the gross margins above, it almost certainly produces more profit as well.

This is a dramatic reversal for those of us who remember the bad old days, when the Mac had about 1% market share and, in 1997, Wired magazine ran this cover:

applepray.jpeg


In the financial conference call with analysts, Apple stated that customer satisfaction with the Mac is 96% according to their own internal research. They also stated that Mac ownership is at an all-time high. Mac users are clearly very loyal to their platform of choice.

Furthering this notion, Safari is now the second most used desktop web browser, having surpassed Edge, and is second only to Chrome. Much of this can be attributed to the decline of Windows PC sales and the increased usage of the Mac as a primary desktop or laptop computing device. While Chrome still dominates, Edge being the default on Windows machines is no longer the advantage it once was.


In summation, while the tech press is going to focus on the Mac's near-term drop in revenue for last quarter, what they likely won't highlight is the simple fact that the Mac now takes in more revenue, and likely more profit, than the equivalent divisions from Intel and AMD combined.

The gamble to move to Apple Silicon has been, without question, an absolute success. There's no other way to interpret it. For an individual user, there are valid reasons to be critical of Apple and its decisions, but for the Mac as a platform, Tim Cook and company have unquestionably made the right call. They should be commended for taking a bold risk that has benefited customers, shareholders, the technology industry as a whole, Apple as an institution, and the Mac as a revitalized product category.
 
2017 - $7.9 billion
2018 - $9.0 billion
2019 - $8.6 billion
2020 - $9.8 billion
2021 - $10.6 billion
2022 - $9.3 billion
2023 - $5.8 billion

Still... that's a significant drop over one quarter, and no doubt being analyzed by Apple.

Now, let's look at the gross margin for each company from last quarter:

Intel - 34.1%
AMD - 44.0%
Apple - 44.3%

Even though Apple's GPM (gross profit margin) has increased over the last few years from around 30% (where it sat for a long time) to where it is now, GPM is largely driven by Wall Street expectations (assuming you're a listed company) and, of course, company needs. It can be neither too small nor too large.

For the small private chip company I was a part of, GPM was largely driven by the need to make a successful business case: covering overhead and R&D costs, paying decent salaries and benefits, and funding the expected level of systems engineering support for customers (which was often substantial, as the tech was relatively new to the industry at the time). And, of course, chip yield and order quantities.

Our GPM would typically run roughly between 60 and 90 percent. Our first digital receiver chip was priced at $300 in quantities of 1-10, $86.50 at qty 500, and $29.83 at qty 10,000 (we never had an order that large :) ). Most of our customers at the beginning were defense/aerospace related.
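To make the arithmetic explicit (using a made-up unit cost purely for illustration): GPM is just (selling price - unit cost) / selling price. If that chip had cost, say, $10 per unit to fab, package, and test, the qty-500 price works out to (86.50 - 10) / 86.50 ≈ 88%, and the qty-10,000 price to (29.83 - 10) / 29.83 ≈ 66%, both inside that 60-90 band. The volume discounts are essentially margin traded away against better yield and amortized overhead.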
 
Wowzers... AAPL is up 4.7% this morning. I'm guessing that might be due to Cook's announcement yesterday afternoon that there will be no mass layoffs at Apple, along with their quarterly report.
 
Going back to my personal perspective, I think that Apple has done an amazing job with CPU performance: the P-cores, E-cores, memory bandwidth, component integration, and various accelerators all arrived much sooner than anything from the PC suppliers. As a result of Apple's vertical integration strategy, implementation was also swifter than what could be expected from Microsoft and its partners.

The work on this started with Grand Central Dispatch, which I think came out in 2009? Something like that. I think it was in Snow Leopard.

But yes. Back then, Apple rewrote the CPU scheduler and how the OS handles threads. They encouraged use of their shiny new scheduler library for multi-threaded apps and libraries, specifically to make better use of multiple (and, though we didn't know it at the time, probably heterogeneous) cores.
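For anyone who never looked at it, here's a minimal sketch of the idea (a toy example of my own; the queue labels and the work inside are made up). You declare intent via a quality-of-service class and let the scheduler decide which cores run what; on Apple Silicon, high-QoS work tends to land on the P-cores while background work runs on the E-cores:

```swift
import Dispatch

// Work the user is actively waiting on: the scheduler prioritizes it,
// and on Apple Silicon it will favor the P-cores.
let renderQueue = DispatchQueue(label: "com.example.render", qos: .userInitiated)

// Background housekeeping: low priority, a natural fit for the E-cores.
let indexQueue = DispatchQueue(label: "com.example.indexing", qos: .utility)

renderQueue.async {
    // e.g. decode and composite the image the user just opened
    print("high-priority work")
}

indexQueue.async {
    // e.g. rebuild a search index nobody is waiting on
    print("low-priority work")
}

// concurrentPerform fans a loop out across however many cores the
// system decides to give you; the caller never hard-codes a core count.
DispatchQueue.concurrentPerform(iterations: 8) { chunk in
    print("processing chunk \(chunk)")
}
```

The point being: nothing in that code mentions core counts or core types, which is exactly what let Apple slide heterogeneous cores underneath a decade of existing software.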

Because they “make the entire widget”, they can plan ahead in software for future possible hardware developments and vice versa, whereas Microsoft, for example, is purely reactive to whatever the hardware vendors put out.

And this is where Intel is failing with their E-core stuff. The flip side of the Microsoft problem.

Not only did they just raid the parts bin and stick two cores with different instruction sets together, they're relying on Microsoft for scheduler support. And Microsoft is playing games, trying to force people onto Windows 11 to get the new scheduler. Customers are reluctant, and so the performance optimisation isn't there.

Apple did an amazing job but it wasn’t due to executing a plan quickly, it would appear that they likely started planning for this 10-15 years before Microsoft did and wrote a bunch of code to help developers seamlessly support this.

Can Apple provide enough GPU horsepower to take on upper mid-range PCs? Everything else about Apple Silicon is impressive, but they have been lacking in this metric, thus far. I do have to keep reminding myself that Apple Silicon is only 1.5 generations in, with the M2 series not even complete.

I think so. Resident Evil Village runs pretty damn well on my M1 Pro. Divinity: Original Sin 2 runs fairly well on my iPad Air. I don't say this as a console peasant being wowed by it, either; I have an all-SSD gaming PC with a 5900X + 6900 XT in it (along with a PS5).

I think the GPU is more powerful than some think; it just needs software written to take advantage of it.

Now that both macOS and iOS are on the same family of SoC, the Apple GPU has a larger combined share of the potential gaming market than all the other players, with decent software libraries to go with it (i.e. Metal across 100M+ new devices annually, just in iPhone).

I think apple is just fine with their GPU strategy. They’re playing the long game.

Right now the PC market is seeing issues with VRAM capacity in the 10+ GB range. The actual code for a game is relatively small; it's mostly video and audio content. Even a baseline M1 Pro system has 16 GB, of which 2/3 or more can be used as VRAM, never mind the systems with more unified memory. Unified memory kills copying from CPU memory to VRAM, and it also opens up huge VRAM capacity.
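To make that concrete, here's a rough sketch of what unified memory means at the Metal API level (a toy example of mine, not from any shipping game; the vertex data is made up). A buffer created with storageModeShared is a single allocation visible to both the CPU and GPU, so there's no staging copy into dedicated VRAM the way there is with a discrete card:

```swift
import Metal

guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("no Metal device available")
}

// On Apple Silicon, a .storageModeShared buffer is one allocation that
// both the CPU and GPU see; there is no separate VRAM copy to manage.
var vertices: [Float] = [0, 1, 0,  -1, -1, 0,  1, -1, 0]
let buffer = device.makeBuffer(bytes: &vertices,
                               length: vertices.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// The CPU can keep mutating the very memory the GPU reads from; on a
// PCIe card this would mean an explicit upload (blit) every frame.
let contents = buffer.contents().bindMemory(to: Float.self, capacity: vertices.count)
contents[1] = 0.5

// How much of system RAM this GPU is allowed to treat as "VRAM",
// which is roughly where that two-thirds figure comes from.
print("GPU working set: \(device.recommendedMaxWorkingSetSize / 1_048_576) MB")
```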

Hogwarts Legacy, for example, runs better on PS5 than on a lot of upper mid-range Nvidia cards right now, because the unified memory architecture gives it sufficient VRAM. The Mac will have similar advantages.
 
Intel is rearranging the deck chairs with their CPU marketing. It's definitely going to be much simpler.

To make up for the words it's removing from its branding, Intel is also adding a new one. Its mainstream processors will now be known simply as Core 3, Core 5, and Core 7. But its high-end chips will be called Core Ultra 5, Core Ultra 7, or Core Ultra 9. The company didn't go into detail about what makes a processor Ultra, though we do know that Core 3 chips cannot be Ultra, while Core 9 chips can only be Ultra.
intel-ultra-03-scaled.jpeg



So what would have been an "Intel Core i7-14700K Processor" might now be an "Intel Core Ultra 7 processor 14700K" (Intel says it prefers the word "processor" to sit in between the "Core Ultra 7" part and the CPU's model number).

I've heard criticism that Apple's naming scheme for the Mac doesn't properly represent their chips and is too complicated. I'm not sure what is so hard to understand about M(x), Pro, Max, Ultra. It's like folks want them to be named Easy, Normal, Hard, Challenging, as if they need a difficulty scale.

Regardless, Intel has clearly solved its branding issues and is no doubt making it easier for potential customers to choose one of their fantastic products.
 
Jebus. This makes me think of when I used to scoff at the branding folks for such gems as: Windows Phone 7 Series

Or that Windows Mobile 6 came as:

Windows Mobile 6 Standard
Windows Mobile 6 Professional
Windows Mobile 6 Classic

Nobody other than OEMs cared which was which, so it was strange to brand it like that publicly. And good luck figuring out what the difference is without looking it up on Wikipedia. :P
 
Personally, I always found the Xeon branding to be extra confusing - so many SKUs with so many little variations, and of course the metallic “platinum” vs “gold” vs … levels.
 
Rebrands seem to work for Intel!

Changing "10nm++ Enhanced SuperFin" to "Intel 7" convinced a lot of people that Intel manufacturing is back on track.

Also, not branding as such, but the bull💩 TDP ratings on their chips trick people into believing their efficiency problem isn't as bad as it really is. Like "Wow, this 45W TDP chip is fast!" ignoring the 115W long-term boost state...

Anyway, enough cynicism! This branding must be a level of genius that a lowly fan of a "lifestyle company" like myself can't comprehend!
Sign me up for one of those sweet Intel Core Ultra 9™ processors with Iris Xe™ graphics and EVO™ technology.
 
Here are a few product matrix charts to compare Intel and Apple.

First, this is the official chart for Intel's current 13th gen desktop line:

Intel13Desktop.jpg


Intel's HX series for performance laptops:

IntelHX.png


Last from Intel, the U-series for thin and light laptops:

IntelU.jpeg


Now, for comparison, this is the entire product matrix for every Mac currently sold by Apple:

MacMatrix.jpg


I think the charts speak for themselves. I don't know why some folks think that the M-series designations are too complicated. Intel does provide more options...a lot more options, I'll give them that, I suppose.
 
Here are a few product matrix charts to compare Intel and Apple.

First, this is the official chart for Intel's current 13th gen desktop line:

View attachment 24390

Intel's HX series for performance laptops:

View attachment 24391

Last from Intel, the U-series for thin and light laptops:

View attachment 24392
Intel’s primary customers, even in the consumer market, are OEMs who are looking for the exact chip to put into their product.

Now, for comparison, this is the entire product matrix for every Mac currently sold by Apple:

View attachment 24393

I think the charts speak for themselves. I don't know why some folks think that the M-series designations are too complicated.

It’s the 13” MacBook Pro hanging off the chart like that - fuck that guy. 🙃 The one thing I’ll concede about the Mac lineup names though is that Pro/Max and Mac in various combinations can be a mouthful or repetitious … it’s not my favorite naming scheme but it works and could be so much worse, so whatever.

Intel does provide more options...a lot more options, I'll give them that, I suppose.

Indeed.
 
Intel’s primary customers, even in the consumer market, are OEMs who are looking for the exact chip to put into their product.
I know. I'm mainly clowning on Intel's naming schemes. They have some byzantine method of distinguishing between a 13600K and a 13400F, but I'm still not sure what it is. Back when I was researching PC parts, I figured out that the "K" was for enthusiasts who like to overclock, while the "F" means no integrated graphics. The significance of the numbers, beyond being 13th gen, continues to elude me. Apparently, 14th gen is going to be based upon both Raptor Lake and Meteor Lake, which just adds to the churn.

When I first started researching a plan "B" three years ago, back when I was uncertain about the future of Apple Silicon, I had to learn all of this crap. I vaguely knew what the Intel and AMD parts were, since they were used in the Macs of that era, but Nvidia's offerings were entirely new to me. While I'm putting a spotlight on Intel, the product models for AMD and Nvidia aren't any better.

Should I get the 13900KS or the 13900K? Perhaps an RX 7900 XTX or RX 7900 XT? Maybe an RTX 4070 Ti or RTX 4070? Mayhap a 7900X3D or 7950X3D? I'm a tech head, and it took me a good while to divine the true nature of these arcane glyphs. I can't imagine what an average consumer would think.

I'm sure Intel has their reasons for a multitude of SKUs; I'm just glad that I don't have to deal with it anymore. With my decision to stay firmly within the Apple camp, I can go back to my blissful ignorance of all things PC. The difference is that I won't have to concern myself with any of the PC manufacturers' parts, because Apple Silicon is the only way forward on the Mac. Also, I don't have to think about Windows anymore, which is an added bonus.

The one thing I’ll concede about the Mac lineup names though is that Pro/Max and Mac in various combinations can be a mouthful or repetitious … it’s not my favorite naming scheme but it works and could be so much worse, so whatever.
You mean going into an Apple Store and asking for an M2 Max Mac MacBook running macOS doesn't roll off the tongue? I think what's important is that Apple is able to distinguish the product line in a way that average consumers can reasonably understand. I'm not really sure what they could have done to make it simpler, but I don't claim to be a marketing genius.
 
I know. I'm mainly clowning on Intel's naming schemes. They have some byzantine method of distinguishing between a 13600K and a 13400F, but I'm still not sure what it is. Back when I was researching PC parts, I figured out that the "K" was for enthusiasts who like to overclock, while the "F" means no integrated graphics. The significance of the numbers, beyond being 13th gen, continues to elude me. Apparently, 14th gen is going to be based upon both Raptor Lake and Meteor Lake, which just adds to the churn.

When I first started researching a plan "B" three years ago, back when I was uncertain about the future of Apple Silicon, I had to learn all of this crap. I vaguely knew what the Intel and AMD parts were, since they were used in the Macs of that era, but Nvidia's offerings were entirely new to me. While I'm putting a spotlight on Intel, the product models for AMD and Nvidia aren't any better.

Should I get the 13900KS or the 13900K? Perhaps an RX 7900 XTX or RX 7900 XT? Maybe an RTX 4070 Ti or RTX 4070? Mayhap a 7900X3D or 7950X3D? I'm a tech head, and it took me a good while to divine the true nature of these arcane glyphs. I can't imagine what an average consumer would think.

I'm sure Intel has their reasons for a multitude of SKUs; I'm just glad that I don't have to deal with it anymore. With my decision to stay firmly within the Apple camp, I can go back to my blissful ignorance of all things PC. The difference is that I won't have to concern myself with any of the PC manufacturers' parts, because Apple Silicon is the only way forward on the Mac. Also, I don't have to think about Windows anymore, which is an added bonus.


You mean going into an Apple Store and asking for an M2 Max Mac MacBook running macOS doesn't roll off the tongue? I think what's important is that Apple is able to distinguish the product line in a way that average consumers can reasonably understand. I'm not really sure what they could have done to make it simpler, but I don't claim to be a marketing genius.
Aye and if you think Intel’s consumer parts are named badly with too many SKUs, as I mentioned before, you should see the Xeons. God damn. I got a headache once trying to go through their various options just to see if I could make sense of them.
 
Now, for comparison, this is the entire product matrix for every Mac currently sold by Apple:

MacMatrix.jpg

For completeness, since you are comparing Apple to Intel, you ought to include the A16 and A15, since they are used in current products; also, the A13 appears in the base model iPad.

Intel’s primary customers, even in the consumer market, are OEMs who are looking for the exact chip to put into their product.

If you wanted to put an ARM SoC in your device, how difficult would it be to select an appropriate model?
 
For completeness, since you are comparing Apple to Intel, you ought to include the A16 and A15, since they are used in current products; also, the A13 appears in the base model iPad.
It's just a straight-up bananas-to-bananas comparison between Intel's desktop/laptop offerings and the Mac line. I could have included the Xeons, but Apple doesn't have a workstation/server line, and the M2 Ultra inside the Mac Pro doesn't count, since it's also in the Mac Studio. I don't consider the iPad to be a desktop computer, even the models that include M-series chips. It's meant to be a rough comparison of old-timey traditional desktop computer chips, nothing more.
 