On the point that consumers are not going to do this:
There is literally an entire community of consumers dedicated to running local machine learning models. I've read about a bunch of people doing this. It's a popular topic on social media, believe it or not. And as I said in my earlier analyses, this is only the beginning: Apple made a major leap forward in democratizing access.
Is every consumer going to? No. Most Mac users don't even use 99% of the Mac's features. But from setup to plug-and-play, Apple's RDMA over Thunderbolt is absolutely consumer-usable, and consumers are using it. You can check the plethora of social media posts, since that's all most people care to "talk" about these days -- "AI."
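To give a rough idea of what that plug-and-play story can look like, here's a minimal sketch, not anything Apple ships: it assumes MLX's distributed module on two or more Macs already linked over Thunderbolt, and the hostnames and launcher invocation below are hypothetical and version-dependent. Whether the transport uses the new RDMA path or plain TCP over the Thunderbolt bridge depends on the macOS and MLX versions.

    # all_sum_demo.py -- toy example: one array summed across every Mac in the cluster.
    # Assumes mlx.core.distributed is available on each machine.
    import mlx.core as mx

    group = mx.distributed.init()        # join the group this process was launched into
    x = mx.random.normal((4,))           # some data local to this Mac
    total = mx.distributed.all_sum(x)    # reduced across all Macs in the group
    mx.eval(total)
    print(f"rank {group.rank()} of {group.size()}: {total}")

You'd launch it with something like mlx.launch --hosts mac1.local,mac2.local all_sum_demo.py (exact flags vary by MLX release), and higher-level tools build on this kind of collective to shard a large model across several Macs.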
Also, whether or not you agree that $40,000 for the equivalent of a $500K setup is consumer pricing, you're completely ignoring one major point: you literally cannot buy an H200 as a consumer. You can walk into an Apple Store, buy Macs, and set yourself up with a consumer equivalent to enterprise server farms. So yeah, it's consumer regardless of opinions on pricing. Oh, and there's that little thing about memory, GPUs, and SSDs becoming 4X more expensive for consumer PCs. At the end of the day, $40,000 is a lot more consumer grade, in the context of literally everything, than you're making it out to be lol
Your "analysis" isn't. The consistent theme I get out of your posts here is that you're starting from a desired conclusion (which is always an extremely fanboyish take), and trying to spin reality to support it
Also, to be clear, the analysis I was referring to is the one I spent multiple hours researching, writing, and verifying, about RDMA over Thunderbolt, how it changes the game, and how it's just the beginning of democratizing access to this for consumers, not my earlier comment spanning two sentences. I'm otherwise ignoring this comment.
Apple's snit with nVidia was always (maybe after the first year or two) embarrassing and more harmful to Apple than nVidia. It's also ancient ridiculous history from a time when each company was a tiny fraction of its current size. It's long past time they got over it.
NVIDIA literally does not care at all about the consumer market and never has. They only operated in it because it generated money, until the next thing came along that generated more money: enterprise servers for whatever bullshit cloud companies push.
Apple's "snit" is that NVIDIA puts zero effort for the end consumer beyond producing a product on spec sheet. This is clearly demonstrated from the time NVIDIA didn't give a damn at all about their chips in Macs causing massive issues, and it continues to this day: 1) horrible pricing, 2) NVIDIA refusing to put more than 32 GB in their products and being behind on nodes despite charging $2000, 3) ports literally melting for multiple generations of their top end GPUs, 4) missing entire parts of the fucking chip (ROPs), 5) NVIDIA claiming massive performance increases and using "AI" to cheat their way through it. Like seriously? They're a chip company, and they somehow manage to screw up every part of it. The only factual part of this snippet is that both companies are well bigger than they were at that point in time.
I have no idea how to respond to this. At this point it feels like willful ignorance. "Esoteric port"? It's Ethernet. (Well, not just ethernet, maybe, depending on what Apple chooses to support, as you could get infiniband too. And maybe will, if they really are using this hw.) And you keep on bringing up AI and nVidia, when that is entirely irrelevant. At the hardware level, this is about networking, not AI, even though AI code is the (presumably) major beneficiary of this. I did my best to explain this back in post #67.
Next time, please stick to addressing the original claims someone makes. You wondered if Apple would adopt QSFP in Macs for some reason related to PCC servers. Then you wondered if they would adopt Mellanox switches in PCC. And then you started making somewhat more definitive claims that they would, simply because Apple added Mellanox support to the Mac at some point.
I simply replied with a really basic opinion: no, I don't think they're doing either (adopting QSFP on the Mac / adopting Mellanox in PCC), and I'm only going off what I have read, rumor-wise and Apple-interview-wise. As I said, my comment citing multiple pieces of information and interviews explaining why I didn't think so got erased, and I'm genuinely sorry that I didn't spend time rewriting the entire comment. It surely might have helped. There really doesn't need to be a whole debate about it. I'm allowed to state my opinion, and my opinion can be wrong lol.
I never suggested any of the following: that Apple will use Thunderbolt for server connections; that QSFP is proprietary to NVIDIA; or that Mellanox has anything to do with AI computation or GPUs (beyond networking them). I directly addressed everything you, @leman, and whoever else said with a short comment from the beginning:
1) I highly doubt Apple is going to add what are basically enterprise ports to this consumer product. It's just not something Apple chases. They are focused on Thunderbolt, and for good reason.
2) PCC has already been confirmed to be fully Apple silicon. Internal transformer models are trained primarily on Google TPU hardware, with some NVIDIA.
And I later clarified that Broadcom is doing the networking, according to rumors. I could be wrong! Rumors are stupid!
But for purposes of discussion, we can't ignore the critical detail that Broadcom, not NVIDIA (no matter how many drivers Apple puts into the Mac), is the widely rumored partner for networking. Oh, and the little fact that PCC servers are verified as stateless. The reason it's all Apple silicon with Broadcom handling networking is that there's no way in hell NVIDIA is going to hand over all its internal IP and give Apple full access so that PCC's claims can be verified. It has to be full Apple silicon with Broadcom networking.
You, @leman, and @mr_roboto can research for yourselves how Apple is partnering with Broadcom, not NVIDIA, for networking; consider that if NVIDIA were even hinted at being involved with PCC, it would be all over Wall Street commentary and news articles; note that PCC servers make specific, guaranteed privacy claims; and come to a similar conclusion: it's highly unlikely NVIDIA is involved in PCC. I guess I made the cardinal sin of saying "I highly doubt" based on stuff I read.
I am a fan of the things I've written comments about. I really like the achievements Apple is making and the progress they're showing, and no one else here has written about the impact. One example: Apple introduced RDMA over Thunderbolt for users, and I explored the possibilities that opens up. I'm allowed to like what they're doing. I can analyze something and also show that I'm a fan of it. Liking their work doesn't invalidate my points!
I'm ending my part of this discussion here. Thanks for reading my points if you made it this far.