M2 Pro and M2 Max

exoticspice1

Site Champ
Posts
298
Reaction score
101
I was interviewing for a chip design job, I think when Exponential went under (it’s a bit of a blur). The interviewers were all asking me software questions instead of hardware questions, and had weird ideas about how you design chips. Probably because they really did design chips in the weird ways they were suggesting. Which isn’t a good way to design chips. Anyway…
Interesting. It's really strange how companies are their own little bubbles. I guess Nvidia must be doing something right with regard to chip design; they do make some powerful and efficient chips nowadays.
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,355
Reaction score
8,570
Interesting. It's really strange how companies are their own little bubbles. I guess Nvidia must be doing something right with regard to chip design; they do make some powerful and efficient chips nowadays.
This was a long time ago, probably the late 1990s.
 

casperes1996

Power User
Posts
187
Reaction score
175
I recall the PhysX stuff, and it really was pretty scummy. (And I’m the least AMD-fanboy you’ll find - hell, I caused a major kerfuffle when they bought ATI and that was part of what led me to leave).
I do personally think I have a bit of an anti-nVidia, pro-AMD bias. In part because nVidia tries to get people to rely on CUDA and such to lock them out of switching to competitor hardware, while AMD relies more on open standards. Now this is naturally just because nVidia is the big player and AMD is the underdog, and were the tables turned I'm sure AMD would try similar moves. It's also not that different to Apple having us write Metal to lock our code into macOS or similar, but I'm more against it on the GPU side than the systems side for a variety of reasons, some more reasonable than others. In any case, nVidia has undeniably done many anti-consumer things throughout the years and has definitely made things explicitly to run well on their hardware and, crucially, poorly on AMD.

One thing is optimising for yourself. Another is intentionally trying to cripple performance on competitor hardware, which it seems nVidia has done on at least a few occasions. They've also had bad practices in dealing with reviewers: trying to influence them, getting upset when reviewers call out problems, and blacklisting them from review copies and such. And while that is their prerogative and they don't owe anyone review copies, they've come off as spiteful in the way they've decided to do so in the past.

And then of course there is naming. nVidia is by no means the only offender here, but just how many very different GPUs have carried the name GT 730? Or 1030, for that matter? With vastly different performance and even architectural generations, making it completely opaque to the customer what product they are even buying, and hard to compare against other models or find reviews and pricing comparisons. nVidia is far from alone in the game of bad practices, but they certainly play it well.
I interviewed at nVidia once. It was bizarre.


I was interviewing for a chip design job, I think when Exponential went under (it’s a bit of a blur). The interviewers were all asking me software questions instead of hardware questions, and had weird ideas about how you design chips. Probably because they really did design chips in the weird ways they were suggesting. Which isn’t a good way to design chips. Anyway…
Is it possible to get further elaboration? Or do you not remember much, or is there something legal that would prohibit it?
Do you know the reasoning for software questions being the focus? Like, were they remotely relevant software questions that would show an understanding of chip needs, or something to do with the way one defines chip designs in software? Or were they just software engineering interview questions like reversing a binary tree (a reference to a common ad in this space...)?
How did their design process seem weird?
 

Andropov

Site Champ
Posts
620
Reaction score
780
Location
Spain
I do personally think I have a bit of an anti-nVidia, pro-AMD bias. In part because nVidia tries to get people to rely on CUDA and such to lock them out of switching to competitor hardware, while AMD relies more on open standards. Now this is naturally just because nVidia is the big player and AMD is the underdog, and were the tables turned I'm sure AMD would try similar moves. It's also not that different to Apple having us write Metal to lock our code into macOS or similar, but I'm more against it on the GPU side than the systems side for a variety of reasons, some more reasonable than others. In any case, nVidia has undeniably done many anti-consumer things throughout the years and has definitely made things explicitly to run well on their hardware and, crucially, poorly on AMD.
I'd argue that while it's true that writing GPU code for iOS/macOS is now also tied to a proprietary standard (Metal), Apple did try on several occasions to push an open standard. They donated OpenCL, which didn't get widespread adoption due to NVIDIA pushing CUDA instead. Apple was for a long time one of the major players pushing OpenCL adoption. The Mac Pro 6,1 was designed around GPU compute power, which at the time could only be leveraged with OpenCL (Metal was released 9 months later). They only left the Khronos Group after repeated failed attempts to modernize OpenGL. Apple did literally everything they could to try to get modern, open standards adopted for GPU computing/rendering.
 

leman

Site Champ
Posts
643
Reaction score
1,198
I'd argue that while it's true that writing GPU code for iOS/macOS is now also tied to a proprietary standard (Metal), Apple did try on several occasions to push an open standard. They donated OpenCL, which didn't get widespread adoption due to NVIDIA pushing CUDA instead. Apple was for a long time one of the major players pushing OpenCL adoption. The Mac Pro 6,1 was designed around GPU compute power, which at the time could only be leveraged with OpenCL (Metal was released 9 months later). They only left the Khronos Group after repeated failed attempts to modernize OpenGL. Apple did literally everything they could to try to get modern, open standards adopted for GPU computing/rendering.

I think there is also a difference between proprietary APIs in an already proprietary OS and proprietary vendor APIs. I can fully understand that some people are not very happy about Apple going fully custom on their GPU APIs. But at the end of the day, that's their OS, their devices and their risk: if they alienate or frustrate their user base because of the lack of compatible software, that's on Apple. What Nvidia is doing instead is using their position as a market leader to lock in customers.

Besides, an argument can be made that Metal is a sufficiently different API which boasts its own design vision and has its own goals.
 

casperes1996

Power User
Posts
187
Reaction score
175
I'd argue that while it's true that writing GPU code for iOS/macOS is now also tied to a proprietary standard (Metal), Apple did try on several occasions to push an open standard. They donated OpenCL, which didn't get widespread adoption due to NVIDIA pushing CUDA instead. Apple was for a long time one of the major players pushing OpenCL adoption. The Mac Pro 6,1 was designed around GPU compute power, which at the time could only be leveraged with OpenCL (Metal was released 9 months later). They only left the Khronos Group after repeated failed attempts to modernize OpenGL. Apple did literally everything they could to try to get modern, open standards adopted for GPU computing/rendering.
Agreed. As I wrote originally, I also have more and less reasonable reasons for thinking it's different to do it at a systems level than at a parts level. Another reason being that even if you do buy a Mac, you can still run Linux on it. Or Windows. Granted, that's more of an option on Intel Macs, but with Asahi it's becoming more possible on Apple Silicon too. And MoltenVK is not being blocked, so they're not fighting the open standards even if they aren't explicitly cooperating with them in this case. And while not developed in the open, XNU is open source as well, so Apple does play in the open space. nVidia has some horses in the open races too, though.
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,355
Reaction score
8,570
Is it possible to get further elaboration? Or do you not remember much, or is there something legal that would prohibit it?
Do you know the reasoning for software questions being the focus? Like, were they remotely relevant software questions that would show an understanding of chip needs, or something to do with the way one defines chip designs in software? Or were they just software engineering interview questions like reversing a binary tree (a reference to a common ad in this space...)?
How did their design process seem weird?

It’s been a long while, but I do recall that the questions were pure software questions. Sorting, reference counting, table lookups, that sort of thing. Completely irrelevant to what would have been my job at any CPU design place, but perhaps relevant to how they designed GPUs.

It seemed to me that to design GPUs they wrote code that described the behavior of the GPU, then let automated tools create the physical design for that behavior (a process called synthesis). Lots of companies did that, but generally for ASICs where power and performance weren’t all that critical. Nobody I talked to at nVidia knew the ins and outs of designing optimal logic gates, how wires affect the timing of a design, or the math that static timing tools use to calculate delays and the shortcomings of those tools. Those are things you’d need to know if you weren’t relying on automation to do all the work for you. The problem is, at least back then, those tools were much worse than what humans could accomplish. Experimentally we found those tools to be at least 20% worse at power consumption and at clock speed.
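
(To give a flavor of the delay math I mean, and purely as a rough illustration rather than what any particular tool implements: the classic first-order estimate for an RC wire is the Elmore delay, where you walk along the path and add up each resistance times all of the capacitance downstream of it, roughly t ≈ Σ R_i × C_downstream(i). Real static timing tools layer far more elaborate models on top of that, and knowing where those models break down was part of the job.)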
 

casperes1996

Power User
Posts
187
Reaction score
175
It’s been a long while, but I do recall that the questions were pure software questions. Sorting, reference counting, table lookups, that sort of thing. Completely irrelevant to what would have been my job at any CPU design place, but perhaps relevant to how they designed GPUs.

It seemed to me that to design GPUs they wrote code that described the behavior of the GPU, then let automated tools create the physical design for that behavior (a process called synthesis). Lots of companies did that, but generally for ASICs where power and performance weren’t all that critical. Nobody I talked to at nVidia knew the ins and outs of designing optimal logic gates, how wires affect the timing of a design, or the math that static timing tools use to calculate delays and the shortcomings of those tools. Those are things you’d need to know if you weren’t relying on automation to do all the work for you. The problem is, at least back then, those tools were much worse than what humans could accomplish. Experimentally we found those tools to be at least 20% worse at power consumption and at clock speed.
I see. Does this work similarly to writing something akin to the microcode pseudocode Intel publishes, and having it generate circuits based on that? Probably with more information too, so every chip doesn't come out the same.

But yeah, that's interesting. Even going far back, they never seem (as I recall) to have been that far behind any competition, so they must have somehow made it work decently for them, even if it could have been more optimal than it was.
 

Colstan

Site Champ
Posts
822
Reaction score
1,124
I recall the PhysX stuff, and it really was pretty scummy. (And I’m the least AMD-fanboy you’ll find - hell, I caused a major kerfuffle when they bought ATI and that was part of what led me to leave).
This is interesting to me, from a historical perspective. Why did you oppose the ATI purchase? Looking back today, do you still think it was a mistake?
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,355
Reaction score
8,570
I see. Does this work similarly to writing something akin to the microcode pseudocode Intel publishes, and having it generate circuits based on that? Probably with more information too, so every chip doesn't come out the same.

Not exactly. It’s more detailed than that. Each unit is described in code. So, for example, for the instruction decoder there would be a module (or function or subroutine, depending on the language used) that describes the work done by that unit. Along the lines of:

when CLK rises:
opcode <- input[3:0]
operandA <- input[…
….
let ALUop = (opcode[0] == 1)


Then the synthesis tool figures out what logic gates perform those functions, which variables persist across clock boundaries and need to be stored in flip-flops, etc., and produces a text file that describes all the logic gates and how they are connected to each other.
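
That output is basically a flat list of gate and flip-flop instances plus the wires between them. A toy fragment might look something like the following; the cell names are invented for illustration, since every library has its own, and a real one runs to millions of lines.

module decode_gates (clk, instr, opcode, alu_op);
    input        clk;
    input  [3:0] instr;
    output [3:0] opcode;
    output       alu_op;

    // one library flip-flop instance per stored bit, wired to the decoded inputs
    DFF_X1 opcode_reg_0 (.D(instr[0]), .CK(clk), .Q(opcode[0]));
    DFF_X1 opcode_reg_1 (.D(instr[1]), .CK(clk), .Q(opcode[1]));
    DFF_X1 opcode_reg_2 (.D(instr[2]), .CK(clk), .Q(opcode[2]));
    DFF_X1 opcode_reg_3 (.D(instr[3]), .CK(clk), .Q(opcode[3]));
    DFF_X1 alu_op_reg   (.D(instr[0]), .CK(clk), .Q(alu_op));
endmodule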

Then they take that file (called a netlist) and feed it into a tool that picks sizes for each logic gate, decides where the gates go on the chip, and then draws metal wires to connect them all. (That part of the process is called “place and route”).

Then you run timing analysis on it, find out your clock speed is off by 50%, and you start modifying the input source code to try and coax the tools to do a better job.
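
As a made-up example of the kind of coaxing I mean: if the tools can’t close timing on a long chain of logic, you often end up splitting the work across clock cycles yourself in the source, something like this.

module mac_pipelined (
    input  wire        clk,
    input  wire [15:0] a, b, c, d,
    input  wire [31:0] e,
    output reg  [31:0] result
);
    // before: result <= (a * b) + (c * d) + e;  // one long path that misses timing
    // after: register the intermediate results so each cycle has less logic to get through
    reg [31:0] prod_ab, prod_cd, e_q;

    always @(posedge clk) begin
        prod_ab <= a * b;   // stage 1: the multiplies
        prod_cd <= c * d;
        e_q     <= e;       // keep e lined up with the products
    end

    always @(posedge clk) begin
        result <= prod_ab + prod_cd + e_q;   // stage 2: the adds
    end
endmodule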

This flow makes me itch with frustration, but it is certainly incredibly common and even used for parts of high end CPUs today. (I am happy to say I never used it even once other than as part of experiments).

But yeah, that's interesting. Even going far back, they never seem (as I recall) to have been that far behind any competition, so they must have somehow made it work decently for them, even if it could have been more optimal than it was.

Well, the competition was doing that too, presumably. The market was different - instead of designing the best product that could be designed for a particular process node (which is what the CPU folks do), they design the best product they can release by a specific date, because every 9 months (or whatever) they will release another product.

Any CPU design team could bury them, but it took 2+ years to design a product that way.
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,355
Reaction score
8,570
This is interesting to me, from a historical perspective. Why did you oppose the ATI purchase? Looking back today, do you still think it was a mistake?

It wasn’t so much a business thing. Google me and Bulldozer and you’ll see what I predicted would happen.

Say you’re newly managing a chip design company and don’t know anything about designing chips. You have these CPU guys that designed Opteron/Athlon 64/x86-64, but they said it would take 2 years and instead it took 2.5. That’s 6 months of lost sales! Huge failure. And the consultants you hired from IBM say your guys are too slow designing chips and wouldn’t you like to buy all of IBM’s tools which were so successful with PowerPC and maybe number your bits in reverse?

You buy ATI, and the ATI folks are designing a new chip every 9 months.

Hmm. Chips are chips. Why not have the GPU guys show the CPU guys how it’s done? I mean, what could go wrong?
 

Colstan

Site Champ
Posts
822
Reaction score
1,124
Hmm. Chips are chips. Why not have the GPU guys show the CPU guys how it’s done? I mean, what could go wrong?
Oh no. I'm nowhere close to being a chip designer, and even I could predict how that was going to go.

That being said, we're all familiar with your thoughts on Intel. What are your thoughts on the current state of AMD?

...that's assuming that you have any. When Michael J. Fox asked David Letterman what Americans thought of Canada, Letterman responded, "We don't".
 

Cmaier

Site Master
Staff Member
Site Donor
Posts
5,355
Reaction score
8,570
Oh no. I'm nowhere close to being a chip designer, and even I could predict how that was going to go.

That being said, we're all familiar with your thoughts on Intel. What are your thoughts on the current state of AMD?

...that's assuming that you have any. When Michael J. Fox asked David Letterman what Americans thought of Canada, Letterman responded, "We don't".
I don’t have any particular thoughts on it. I recently chatted with a guy who is still there and he says morale has improved. Apparently there are about five folks still there from my days.

But since I will never own an AMD product, I don’t have a lot of reason to think about them much anymore.
 

Colstan

Site Champ
Posts
822
Reaction score
1,124
But since I will never own an AMD product, I don’t have a lot of reason to think about them much anymore.
I have an interest in the other side of the pond, but it comes and goes. A couple of years ago, if you had told me that Nvidia were offering a 2060 Ti, then I would have had no idea what you were talking about. I hadn't followed PC tech since I switched in 2005, except for whatever Apple was putting inside of Intel Macs. In the past year or so, I had considered building a gaming PC, like in "the olden days", but after looking at what is involved, my interest has waned considerably.

I'm not completely ruling it out, but dealing with heat, noise, a big-ass case, and Windows (burn it with fire!) shows that I just don't have the same passion for it that I once did. Back when I was young and stupid, I enjoyed the process of building a PC and tweaking drivers. Now that I have grown to be old and stupid, that all just seems like a big chore.

Linus recently put out a video with tips for improving the Windows experience, or at least attempting to. He starts out by saying "Windows has a lot of problems". I'm glad he's here to tell us these things. Another channel posted an optimization guide for Windows 11 gaming. It's 45 minutes long; I tapped out after 10. Thankfully, 90% of the games that I do play are turn-based isometric RPGs, a specific niche, but one where essentially all of them have Mac versions.

It was a fanciful idea from times past, but now I just want a computer that is small, quiet, requires no maintenance, and runs an operating system that doesn't actively hate me. I can get that with a Mac, hence it's unlikely that I will ever own another product from Intel, Nvidia, or AMD, either.
 

casperes1996

Power User
Posts
187
Reaction score
175
I have an interest in the other side of the pond, but it comes and goes. A couple of years ago, if you had told me that Nvidia were offering a 2060 Ti, then I would have had no idea what you were talking about. I hadn't followed PC tech since I switched in 2005, except for whatever Apple was putting inside of Intel Macs. In the past year or so, I had considered building a gaming PC, like in "the olden days", but after looking at what is involved, my interest has waned considerably.

I'm not completely ruling it out, but dealing with heat, noise, a big-ass case, and Windows (burn it with fire!) shows that I just don't have the same passion for it that I once did. Back when I was young and stupid, I enjoyed the process of building a PC and tweaking drivers. Now that I have grown to be old and stupid, that all just seems like a big chore.

Linus recently put out a video with tips for improving the Windows experience, or at least attempting to. He starts out by saying "Windows has a lot of problems". I'm glad he's here to tell us these things. Another channel posted an optimization guide for Windows 11 gaming. It's 45 minutes long; I tapped out after 10. Thankfully, 90% of the games that I do play are turn-based isometric RPGs, a specific niche, but one where essentially all of them have Mac versions.

It was a fanciful idea from times past, but now I just want a computer that is small, quiet, requires no maintenance, and runs an operating system that doesn't actively hate me. I can get that with a Mac, hence it's unlikely that I will ever own another product from Intel, Nvidia, or AMD, either.
I agree with most of this, but I do have a great interest in video games (I also co-host the MacGameCast, so the overlapping interests are telling). Unfortunately most games still do not get a port to macOS, and many ports that do exist are frankly horrible, offering less than half the performance on the same hardware (Boot Camp proves it). I have my iMac with Boot Camp right now in addition to my M1 Max laptop, but in the future it is possible I would have a PC that does absolutely nothing but act as a gaming supplement to my Macs. More likely, however, I will just get a PlayStation for the non-Mac titles. Regardless, consoles also come with AMD hardware (at present at least), so there would be AMD hardware in whatever it would be. And competition drives progress in some cases too, so an interest in what AMD, Intel and nVidia do is also relevant to Apple: if the whole industry does X and it turns out to be a super good move, it is likely Apple will eventually also adopt X as a strategy, or whatever.
 

Andropov

Site Champ
Posts
620
Reaction score
780
Location
Spain
I agree with most of this, but I do have a great interest in video games (I also co-host the MacGameCast, so the overlapping interests are telling). Unfortunately most games still do not get a port to macOS, and many ports that do exist are frankly horrible, offering less than half the performance on the same hardware (Boot Camp proves it). I have my iMac with Boot Camp right now in addition to my M1 Max laptop, but in the future it is possible I would have a PC that does absolutely nothing but act as a gaming supplement to my Macs. More likely, however, I will just get a PlayStation for the non-Mac titles. Regardless, consoles also come with AMD hardware (at present at least), so there would be AMD hardware in whatever it would be. And competition drives progress in some cases too, so an interest in what AMD, Intel and nVidia do is also relevant to Apple: if the whole industry does X and it turns out to be a super good move, it is likely Apple will eventually also adopt X as a strategy, or whatever.
I gave up on playing on macOS and got a PS5. Way less frustrating (the ports are usually terrible, as you say, and many have a bad habit of hijacking the computer audio, which really annoys me). The situation may change in a few years, though: Apple is putting very capable GPUs in every single Apple Silicon Mac out there, so companies may start paying more attention to the Mac once few Intel Macs remain.
 

exoticspice1

Site Champ
Posts
298
Reaction score
101
I gave up on playing on macOS and got a PS5. Way less frustrating (the ports are usually terrible, as you say, and many have a bad habit of hijacking the computer audio, which really annoys me). The situation may change in a few years, though: Apple is putting very capable GPUs in every single Apple Silicon Mac out there, so companies may start paying more attention to the Mac once few Intel Macs remain.
I gave up gaming on Mac as well. My MacBook is strictly for work and my PC for gaming and tinkering.
 

exoticspice1

Site Champ
Posts
298
Reaction score
101
Regardless, consoles also come with AMD hardware (at present at least), so there would be AMD hardware in whatever it would be.
Funnily enough, most Sony titles that come out on PC are more optimised for Nvidia. Not that surprising, since they hold nearly 90% market share in discrete GPUs.
 

casperes1996

Power User
Posts
187
Reaction score
175
I gave up on playing on macOS and got a PS5. Way less frustrating (the ports are usually terrible, as you say, and many have a bad habit of hijacking the computer audio, which really annoys me). The situation may change in a few years, though: Apple is putting very capable GPUs in every single Apple Silicon Mac out there, so companies may start paying more attention to the Mac once few Intel Macs remain.
Ultimately probably a wise move, at least as things are and have been for the past many years. But as someone who needs a powerful Mac for their work, it's already there, so I might as well play on it too. Especially as long as I don't have space for a nice TV and PS5 as well. And my maxed-out 2020 iMac runs games rather well through Boot Camp.

Out of curiosity, what do you mean by games hijacking the audio? I don't think I've experienced this.
 