For comedic value, here is an Apple rumor YouTuber:

Colstan

Site Champ
Posts
822
Reaction score
1,124
My project for this weekend is to integrate a new KVM switch into my setup.
This has been a giant bugbear for me. My 4K LG UltraFine uses USB-C to connect to a Mac. It does also work with DP Alt Mode, but that requires a very specific cable. I've got the proper DisplayPort adapter to work with a PC, but the idea of manually switching the cable between the Mac and PC is simply a pain. About a year ago, I couldn't find a KVM that was suitable, but perhaps that has changed. I'll have to look into it. I realize that it would be easier to simply use a standard PC monitor, but I'm OCD about having a "Retina" display for my Mac, since sub-pixel anti-aliasing went the way of the woolly mammoth after High Sierra.
 

Colstan
i7-13700, RTX 4070, 32GB. It ought to keep me going for quite some time.
That's not bad for a midrange desktop build. I'd have a really hard time choosing between the 13700K and the 7700X. I think the same could be said for the 4070 Ti and the 7900 XT. Right now, I'd probably be leaning slightly toward an AMD build, but I don't think either combination is bad, just different tradeoffs. Regardless, I hope you enjoy your new gaming boxen!
 

Nycturne

Elite Member
Posts
1,111
Reaction score
1,426
That's not bad for a midrange desktop build. I'd have a really hard time choosing between the 13700K and the 7700X. I think the same could be said for the 4070 Ti and the 7900 XT. Right now, I'd probably be leaning slightly toward an AMD build, but I don't think either combination is bad, just different tradeoffs. Regardless, I hope you enjoy your new gaming boxen!
If you’d be looking at a 7700X and 7900 XT for a gaming rig, my personal vote would be to save the $100 on the CPU with a 7600X and put that $100 toward a 7900 XTX right now (or into your pocket).
 

Colstan
If you’d be looking at a 7700X and 7900 XT for a gaming rig, my personal vote would be to save the $100 on the CPU with a 7600X and put that $100 toward a 7900 XTX right now (or into your pocket).
I was trying to do an apples to apples comparison with the specs @MEJHarrison provided. Personally, if I were doing a build today, I'd wait a couple more weeks for the 7800X3D and pair it with a 7900 XTX. That's a bit higher up the price scale, but PC parts are already expensive, so I figure that I'd spend a bit more for the top tier gaming parts.

I tend to hold onto my gear for a long while, so an extra ~$400 for a better CPU and GPU isn't a big deal over many years. That being said, at least AM5 allows for a potential upgrade to Zen 5/6, unlike Intel constantly playing musical sockets.
 

Nycturne
Yeah, it’s just that adding CPU cores tends to be one of the weakest ways to improve gaming perf on the AMD side with Zen 4. And it only gets more GPU-bound as you go to higher resolutions. Not many are going to be doing 1080p at 300+ fps.

Mostly just sharing as someone who has recently gone through building an AMD rig.
 

Colstan
Yeah, it’s just that adding CPU cores tends to be one of the weakest ways to improve gaming perf on the AMD side with Zen 4. And it only gets more GPU-bound as you go to higher resolutions. Not many are going to be doing 1080p at 300+ fps.
I would say that, unless you are using a 4090 and money doesn't matter, a regular non-X 7600 or 13500 is plenty to keep most games GPU-bound these days. While the GPU is always going to be king, Hardware Unboxed has done numerous tests on various components, and found that even things like DDR5 timings and whether Resizable BAR is enabled can have a substantial impact in some games. (Turning off ReBAR actually shows an increase, at times.) That's why I think it's better to slightly over-spec, particularly if the user plans on keeping the build unchanged for many years.
 

MEJHarrison

Site Champ
Posts
873
Reaction score
1,710
Location
Beaverton, OR
That's not bad for a midrange desktop build. I'd have a really hard time choosing between the 13700K and the 7700X. I think the same could be said for the 4070 Ti and the 7900 XT. Right now, I'd probably be leaning slightly toward an AMD build, but I don't think either combination is bad, just different tradeoffs. Regardless, I hope you enjoy your new gaming boxen!

If we had talked a few weeks back, I would have been all over this advice. But by the time I brought it up here it was too late. I'll obsess over the specs and every minor detail for weeks (and already have at this point). But once I jump, I'm committed and don't look back. Since I already had a motherboard on the way when I posted, it was too late for regrets. :D

With that said, I now have a CPU, CPU cooler and keyboard. The final pieces are expected to arrive on Sunday. So hopefully by Sunday evening, I'll have a working PC and will be trying out some PCVR.

The only piece I won't have by Sunday is the wireless device to hook up to the Quest (D-Link Air Bridge). So until then, I'll be going wired, which is fine. That will give me a few days to learn all the ins and outs before diving into getting a wireless connection going.

Also, this is absolutely the wrong place to post this. Sorry!
 

Colstan
If we had talked a few weeks back, I would have been all over this advice. But by the time I brought it up here it was too late.
I think you'll enjoy your new gaming machine, with the variables I mentioned being personal preference.

I'll obsess over the specs and every minor detail for weeks
You're lucky; I do it for years.

But once I jump, I'm committed and don't look back. Since I already had a motherboard on the way when I posted, it was too late for regrets.
Which is the best way to look at it. There's no such thing as a "perfect build". If you're buying a Mac, then it's pretty easy to get it right, with a limited configuration and price point. With custom PCs, there's a ridiculous number of options, and at some point you have to pull the trigger. As Steve Jobs said: "Real artists ship".

With that said, I now have a CPU, CPU cooler and keyboard. The final pieces are expected to arrive on Sunday. So hopefully by Sunday evening, I'll have a working PC and will be trying out some PCVR.
I hope it works out and feel free to keep us updated.

Also, this is absolutely the wrong place to post this. Sorry!
No worries, we all tend to stagger around like drunken frat bros around these parts. Besides, @Cmaier and myself are in charge of this forum, we both gave your post a like, so we clearly don't care.
 

MEJHarrison
Which is the best way to look at it. There's no such thing as a "perfect build". If you're buying a Mac, then it's pretty easy to get it right, with a limited configuration and price point. With custom PCs, there's a ridiculous number of options, and at some point you have to pull the trigger. As Steve Jobs said: "Real artists ship".

I used to obsess over things much more in my younger days. But back then money was a lot tighter. I guess that's one advantage of getting old. :D
 

Colstan
More fun from MacRumors.


The initial version of the A17 Bionic chip will reportedly be manufactured using TSMC's N3B process, but Apple is planning to switch the A17 over to N3E sometime next year.

The rumor comes from a Weibo user who claims to be an integrated circuit expert with 25 years of experience working on Intel's Pentium processors.

Convincing source.
 

Colstan
It’s possible … but … 🤷‍♂️
@Cmaier could speak to the challenges of switching from N3B to N3E far better than I ever could. From what admittedly little I understand, Apple uses big ass caches (using a highly technical term) and N3B has a 5% density advantage compared to N5, while N3E offers no change. The main benefit to switching to N3E is the cost sharing involved with other companies, because Apple is likely the only major customer for N3B. Also, TSMC may want to switch everyone over to N3E so that they don't have to worry about two parallel lines. Again, from what little I understand, doing such a switch isn't a trivial or inexpensive process.

I put this rumor in this thread because of the comedy of MacRumors quoting a Weibo source. After WWDC, we can see that Apple has succeeded in locking down the leakers. As you stated in your epic rant, Gurman was dead wrong about his predictions. It was only a few days before WWDC that he vaguely admitted that the Mac Studio would get an M2 update. He completely missed the Mac Pro, insisting it would be later this year. Kuo lost the plot a long time ago. Even Ross Young, who I don't think is a hokum peddler, has fallen flat with his iMac and new desktop mini-LED monitor predictions. Now that WWDC is over, we're back in silly season. Perhaps Apple will utilize N3E, I'm not an expert, I don't know, I just question whether some random dude on Weibo knows the truth.
 

dada_dave

Elite Member
Posts
2,071
Reaction score
2,050
@Cmaier could speak to the challenges of switching from N3B to N3E far better than I ever could. From what admittedly little I understand, Apple uses big ass caches (using a highly technical term) and N3B has a 5% density advantage compared to N5, while N3E offers no change. The main benefit to switching to N3E is the cost sharing involved with other companies, because Apple is likely the only major customer for N3B. Also, TSMC may want to switch everyone over to N3E so that they don't have to worry about two parallel lines. Again, from what little I understand, doing such a switch isn't a trivial or inexpensive process.

I put this rumor in this thread because of the comedy of MacRumors quoting a Weibo source. After WWDC, we can see that Apple has succeeded in locking down the leakers. As you stated in your epic rant, Gurman was dead wrong about his predictions. It was only a few days before WWDC that he vaguely admitted that the Mac Studio would get an M2 update. He completely missed the Mac Pro, insisting it would be later this year. Kuo lost the plot a long time ago. Even Ross Young, who I don't think is a hokum peddler, has fallen flat with his iMac and new desktop mini-LED monitor predictions. Now that WWDC is over, we're back in silly season. Perhaps Apple will utilize N3E, I'm not an expert, I don't know, I just question whether some random dude on Weibo knows the truth.
Absolutely. I think N3E is also supposedly cheaper to manufacture with better yields, but I don’t think that necessarily implies that switching nodes mid-manufacturing makes sense. And you are quite right about the quality of the source.
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,215
Reaction score
8,264
@Cmaier could speak to the challenges of switching from N3B to N3E far better than I ever could. From what admittedly little I understand, Apple uses big ass caches (using a highly technical term) and N3B has a 5% density advantage compared to N5, while N3E offers no change. The main benefit to switching to N3E is the cost sharing involved with other companies, because Apple is likely the only major customer for N3B. Also, TSMC may want to switch everyone over to N3E so that they don't have to worry about two parallel lines. Again, from what little I understand, doing such a switch isn't a trivial or inexpensive process.

I put this rumor in this thread because of the comedy of MacRumors quoting a Weibo source. After WWDC, we can see that Apple has succeeded in locking down the leakers. As you stated in your epic rant, Gurman was dead wrong about his predictions. It was only a few days before WWDC that he vaguely admitted that the Mac Studio would get an M2 update. He completely missed the Mac Pro, insisting it would be later this year. Kuo lost the plot a long time ago. Even Ross Young, who I don't think is a hokum peddler, has fallen flat with his iMac and new desktop mini-LED monitor predictions. Now that WWDC is over, we're back in silly season. Perhaps Apple will utilize N3E, I'm not an expert, I don't know, I just question whether some random dude on Weibo knows the truth.

I don’t even understand the rumor. Is the theory that they are going to tape out a chip in N3B and then re-tape the same thing out for N3E? No.

Could they use both processes but for different chips for different products? Of course.
 

Colstan
I don’t even understand the rumor. Is the theory that they are going to tape out a chip in N3B and then re-tape the same thing out for N3E? No.

Could they use both processes but for different chips for different products? Of course.
From my reading, the rumor claims that the A17 will start out on N3B, then Apple will replace it with a functionally identical (not physically identical) model on N3E. Or at least close enough that they could market it that way. It's fuzzy, I could certainly be wrong, which is why I lumped it in with silly season.

Putting the actual rumor aside, I guess what I'm personally interested in is the longevity of N3B, and Apple's use of it. Is this a short-term solution for the A17, with followup 3nm chips (M-series) moving to N3E, or are we likely to see Apple continue to use N3B for all of its silicon products until N3P? I don't expect you to have the answer, just wondering if you have any thoughts on the matter.
 

Cmaier
From my reading, the rumor claims that the A17 will start out on N3B, then Apple will replace it with a functionally identical (not physically identical) model on N3E. Or at least close enough that they could market it that way. It's fuzzy, I could certainly be wrong, which is why I lumped it in with silly season.

Putting the actual rumor aside, I guess what I'm personally interested in is the longevity of N3B, and Apple's use of it. Is this a short-term solution for the A17, with followup 3nm chips (M-series) moving to N3E, or are we likely to see Apple continue to use N3B for all of its silicon products until N3P? I don't expect you to have the answer, just wondering if you have any thoughts on the matter.

I have no thoughts on it other than that Apple has to make these sorts of decisions about 18 months in advance.
 

Jimmyjames

Site Champ
Posts
634
Reaction score
708
From my reading, the rumor claims that the A17 will start out on N3B, then Apple will replace it with a functionally identical (not physically identical) model on N3E. Or at least close enough that they could market it that way. It's fuzzy, I could certainly be wrong, which is why I lumped it in with silly season.

Putting the actual rumor aside, I guess what I'm personally interested in is the longevity of N3B, and Apple's use of it. Is this a short-term solution for the A17, with followup 3nm chips (M-series) moving to N3E, or are we likely to see Apple continue to use N3B for all of its silicon products until N3P? I don't expect you to have the answer, just wondering if you have any thoughts on the matter.
Confused by this rumor. Everything I’ve seen shows that N3E is better than N3B. I don’t understand why it’s been portrayed in a poor way by MacRumors. I’ve seen figures showing N3B giving a 10%-15% perf advantage while N3E gives a 15%-20% perf advantage. Why would anyone be concerned by this move? Apple would have gone straight to N3E if possible.
 

Colstan
Confused by this rumor. Everything I’ve seen shows that N3E is better than N3B. I don’t understand why it’s been portrayed in a poor way by MacRumors. I’ve seen figures showing N3B giving a 10%-15% perf advantage while N3E gives a 15%-20% perf advantage. Why would anyone be concerned by this move? Apple would have gone straight to N3E if possible.
Courtesy of Anandtech:

[Screenshot: AnandTech table comparing TSMC N3 process variants]


From my basic understanding, if you care about cache, you'll get a 5% density advantage from N3P. If you care about performance, N3E gives you between 3% and 8%. None of this likely matters to the end user; we're not talking about vast differences. I'm curious purely from a technical standpoint. MacRumors is making noise about nothing, as is tradition.
 