X86 vs. Arm

I haven't read the article, but you've got your history mixed up a bit. Jay Miner and several friends left Atari to found Hi-Toro, which was later renamed Amiga. Miner was the principal designer of Atari's 8-bit computer chipset, and Amiga's chipset was essentially version 2.0.
Also, Tramiel did want to develop a 32-bit computer at Commodore. It was Irving Gould, a big Commodore shareholder, who thought the future was unimportant because the present was so good, and managed to push Tramiel out.

Wow, I must have been really tired when I wrote this...
Of course I know that Jay Miner designed the Atari 800, and I even went to his Wikipedia entry to check that I had the correct designer.
I just checked the book On the Edge (2007, p. 396), which I probably read a decade ago:
Unfortunately, the naive Atari management thought they could design tomorrow's computers around today's expectations. They thought the $100 price tag on the 68000 was too much, and apparently could not conceive of a time in the future when the chip would cost less. The management was basking in their present success, unconcerned with the future of technology. It was obvious Miner could not be allowed to advance technology at Atari so he quit.

I can only guess my mind always associated the more general term management with Tramiel.
Thanks for the corrections, @mr_roboto.
But I still don't get the reference in the article, because Tramiel cannot be responsible for the demise of the Amiga, no matter what kind of manager he might have been.

Someone in the comment section of the article also wrote that management couldn't be the deciding factor, because Acorn, Atari, and Commodore all failed against the PC market.
Only the Mac held its niche (probably due to its heavy use in specialized markets), and ARM was in a totally different market altogether.
 
But I still don't get the reference in the article, because Tramiel cannot be responsible for the demise of the Amiga, no matter what kind of manager he might have been.
I had a look at it, and I don't think he's actually saying Tramiel personally killed the Amiga? He's saying Tramiel's "business is war" mentality ended up killing both Commodore and Amiga, long after Tramiel was gone. It's not very good writing though.

Someone in the comment section of the article also wrote that management couldn't be the deciding factor, because Acorn, Atari, and Commodore all failed against the PC market.
Only the Mac held its niche (probably due to its heavy use in specialized markets), and ARM was in a totally different market altogether.
Can't say I agree with that commenter. A better-run Atari or Commodore could have done a lot better against the PC, IMO.
 
I had a look at it, and I don't think he's actually saying Tramiel personally killed the Amiga? He's saying Tramiel's "business is war" mentality ended up killing both Commodore and Amiga, long after Tramiel was gone. It's not very good writing though.


Can't say I agree with that commenter. A better-run Atari or Commodore could have done a lot better against the PC, IMO.
I recommend the books by Brian Bagnall, who explains in great detail the various strategic mistakes Commodore made.
 
Part 2 of "how to design a CPU" is posted.

 
Just stumbled across an excellent post about RISC-V and wanted to share it: https://lobste.rs/s/icegvf/will_risc_v_revolutionize_computing#c_8wbb6t

And relevant discussion with commentary from RWT: https://www.realworldtech.com/forum/?threadid=209889&curpostid=209889

I like this bit (by user --- from the RWT discussion):

I haven’t looked too deeply into risc-v, but whenever I look at it my gut response is that it feels “academic.” The folks in charge obviously know CPUs as well as anyone, but professors and people who have to sell stuff come at problems differently sometimes.

Of course, countless chips have toy processors in them that are used as simple controllers where performance isn’t important, and risc-v may take over that role. And you never know— things that seem imperfect and incomplete often disrupt markets from the bottom up.

But some of the design decisions in risc-v do seem to limit its potential to really attack the meat of the Arm market.
General article on RISC-V's potential for commercialization, by Sophia Chen in MIT Technology Review:

 
Of course, countless chips have toy processors in them that are used as simple controllers where performance isn’t important, and risc-v may take over that role.

ARM version one is public domain, well trod, powerful and easy to build. Why would you mess around with RISC-V when you can just make it ARM? (Granted, Thumb is still under patent, so you best be OK with using the wide ISA).
 
ARM version one is public domain, well trod, powerful and easy to build. Why would you mess around with RISC-V when you can just make it ARM? (Granted, Thumb is still under patent, so you best be OK with using the wide ISA).

I guess the two reasons would be to save money and to have more freedom to customize it?

The thing people sometimes don’t think about, though, is that just because you use RISC-V doesn’t mean somebody won’t come along and sue you for patent infringement. At least if you use Arm, there’s a pretty good chance that Arm will defend the lawsuit and indemnify you if you lose. I don’t think the RISC-V guys are going to come running to your defense.
 
Nice thread. One thing to remember is Apple Silicon and ARM are not the same thing per se.

Apple only uses the ARM ISA, not the ARM SoC core designs (like Cortex). The all-important microarchitecture is a 100% Apple in-house design. So Apple is only "ARM" in the sense that it currently uses the ARM ISA. It is better thought of as RISC, which also points up the contrast with x86.
 
Apple only uses the ARM ISA, not the ARM SoC core designs (like Cortex). The all-important microarchitecture is a 100% Apple in-house design. So Apple is only "ARM" in the sense that it currently uses the ARM ISA. It is better thought of as RISC, which also points up the contrast with x86.

I'm not sure I agree with that last bit. x86 is also not just the Intel Core series of microarchitectures; AMD has its own independently designed x86 cores. x86 does not refer to a singular series of designs, and neither does ARM. Nomenclature-wise, RISC is the contrast to CISC, not to x86. The ARM family of ISAs (32-bit, 64-bit, and Thumb) is the counterpart to x86 as a family, which spans everything from the 8086's real mode up to the protected modes of the i386 line, the x87 FPU, and AMD's 64-bit extension.

ARM may also denote the company, but if you specifically talk about their designs I would explicitly say Cortex A-whatever or another fully qualified name. And if I talk about Apple's core designs I will denote them as such: Firestorm, Icestorm, Blizzard, etc.

But yes, it should be communicated in such a way that nobody comes under the impression Apple just changed vendor from Intel to some other chipmaker, or is buying core designs and just doing everything around them. But I would think that is already rather clear, if nothing else purely by how far ahead Apple's designs are relative to other ARM-based chips.
 
Partly true. Actually, we should probably say x64 instead of x86 anyway. Yes, RISC is the proper contrast to CISC. I just dislike referring to Apple Silicon as ARM because it causes confusion and makes people think any old ARM-based design is as performant. I suspect this is partly why Apple itself has taken pains not to identify their silicon as ARM in their promotional materials or anywhere else.
 
Partly true. Actually, we should probably say x64 instead of x86 anyway. Yes, RISC is the proper contrast to CISC. I just dislike referring to Apple Silicon as ARM because it causes confusion and makes people think any old ARM-based design is as performant. I suspect this is partly why Apple itself has taken pains not to identify their silicon as ARM in their promotional materials or anywhere else.
I suspect that Apple intends to diverge from the ARM instruction set over time, maybe even changing instruction sets entirely. That, and the fact that the vast majority of users don’t care what ISA is being used, are pretty good reasons for Apple not to mention Arm very often.
 
I suspect that Apple intends to diverge from the ARM instruction set over time, maybe even changing instruction sets entirely. That, and the fact that the vast majority of users don’t care what ISA is being used, are pretty good reasons for Apple not to mention Arm very often.

That's going to suck if they do, to be honest. It might be the turning point where Apple stops being my primary machine, along with the primary machine of many developers who work across Apple and Linux. The fact that they built their own virtualization framework to help bring vendors like VMware over to Apple Silicon, and built out a boot architecture for the Mac that enables teams like Asahi, seems to contradict a world where Linux is basically unusable because Apple built its own ISA or broke compatibility with one that Linux supports. That said, I agree there are plenty of marketing reasons not to mention ARM much except at WWDC.

Using the ARM ISA, with extensions hidden behind frameworks the way they are, seems like a really good place for Apple to be when it comes to developers and keeping the ability for folks to do Linux/server work on a Mac, including Apple's own teams. Hell, I half expect that supporting virtualization was an internal requirement.
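
For what it's worth, here's roughly what that framework buys you. A minimal sketch (my own, not from the article or anyone's docs) of booting a Linux guest through Apple's Virtualization framework; the kernel/initrd paths and the CPU/memory sizes below are placeholders, and a real app also needs the com.apple.security.virtualization entitlement:

```swift
import Foundation
import Virtualization

// Sketch: boot a Linux guest with Apple's Virtualization framework.
// Paths are placeholders; adjust to a real kernel and initrd.
let bootLoader = VZLinuxBootLoader(kernelURL: URL(fileURLWithPath: "/path/to/vmlinuz"))
bootLoader.initialRamdiskURL = URL(fileURLWithPath: "/path/to/initrd")
bootLoader.commandLine = "console=hvc0"

let config = VZVirtualMachineConfiguration()
config.bootLoader = bootLoader
config.cpuCount = 2
config.memorySize = 2 * 1024 * 1024 * 1024   // 2 GiB

do {
    try config.validate()                     // throws if the configuration is invalid
    let vm = VZVirtualMachine(configuration: config)
    vm.start { result in                      // runs on the VM's queue (main by default)
        switch result {
        case .success:
            print("Guest started")
        case .failure(let error):
            print("Failed to start guest: \(error)")
        }
    }
    RunLoop.main.run()                        // keep the process alive for the callback
} catch {
    print("Invalid configuration: \(error)")
}
```

The point is that vendors get a supported path to run ARM Linux guests without caring what Apple does underneath the ISA surface.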
 
I suspect that Apple intends to diverge from the ARM instruction set over time, maybe even changing instruction sets entirely. That, and the fact that the vast majority of users don’t care what ISA is being used, are pretty good reasons for Apple not to mention Arm very often.

Diverge in that they may add more custom instructions, sure. But I highly doubt they will change the foundations within the next 20 years. Rosetta is great, but Rosetta is not intended to stay, and not everything works on top of Rosetta anyway. Transition periods hurt, even if technologies like Rosetta make them hurt less. And then there's what Nycturne said above.
Of course, licensing issues may pop up to force their hand, but I wouldn't bet on an intentional move away from ARM as the baseline. Co-processors, custom instructions, implementation-specific registers: all possible. But not a fundamentally different ISA, I think.
 
I suspect that Apple intends to diverge from the ARM instruction set over time, maybe even changing instruction sets entirely. That, and the fact that the vast majority of users don’t care what ISA is being used, are pretty good reasons for Apple not to mention Arm very often.
They already do diverge some, as the Apple Silicon "ISA" is a superset of ARM. You will probably see more additions over time.
 
I occasionally encounter the assertion "ISA is not CPU architecture", even though it has "architecture" right in the name. The claim is that x86 gets the work done even with its kludgey instruction set and its limited set of dedicated registers. One has to be impressed by how good the engineers are at making x86 code run well, which is obviously aided by compilers that have advanced to the point of producing very efficient code.

But a processor can only do work as framed by code, so the ISA does form a very large component of the architecture. I am just curious about how this argument works. What are the elements of architecture that are distinctly unaffected by the ISA?
 
I occasionally encounter the assertion "ISA is not CPU architecture", even though it has "architecture" right in the name. The claim is that x86 gets the work done even with its kludgey instruction set and its limited set of dedicated registers. One has to be impressed by how good the engineers are at making x86 code run well, which is obviously aided by compilers that have advanced to the point of producing very efficient code.

But a processor can only do work as framed by code, so the ISA does form a very large component of the architecture. I am just curious about how this argument works. What are the elements of architecture that are distinctly unaffected by the ISA?

We have words for all this. People confuse microarchitecture with architecture. When they start talking about caches, how many parallel pipes there are, how out-of-order things can be, etc., that’s microarchitecture, not architecture. (Though the job title “architect” covers both).

Things like how many architectural registers there are, the ISA, etc., are "architecture." Things like how many physical registers there are, how deep the reorder queue is, etc., are microarchitecture.
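
A quick way to see the split in practice (a little sketch I put together, not from the thread): the two timed runs below execute exactly the same architectural work, loads and integer adds, but traversing the array sequentially versus in random order makes the caches and prefetchers, which are pure microarchitecture, show up as a large timing gap. Exact numbers will depend on the chip.

```swift
import Foundation

// Same ISA-level work either way: load an element, add it to a running total.
// Only the order of the loads changes, so any timing gap comes from the
// microarchitecture (caches, prefetchers), not from the architecture.
let count = 1 << 22                                // ~4M elements, bigger than typical L2
let data = [Int](repeating: 1, count: count)
let sequentialOrder = Array(0..<count)
let randomOrder = sequentialOrder.shuffled()

func sum(_ values: [Int], visiting order: [Int]) -> Int {
    var total = 0
    for i in order { total &+= values[i] }
    return total
}

func time(_ label: String, _ body: () -> Int) {
    let start = DispatchTime.now().uptimeNanoseconds
    let result = body()                            // printing the result keeps the work live
    let ms = Double(DispatchTime.now().uptimeNanoseconds - start) / 1_000_000
    print("\(label): \(String(format: "%.1f", ms)) ms (sum \(result))")
}

time("sequential order") { sum(data, visiting: sequentialOrder) }
time("random order")     { sum(data, visiting: randomOrder) }
```

Nothing in the ISA spec tells you the second run will be slower; that's the part the architect's "architecture" leaves entirely to the microarchitecture.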
 
We have words for all this. People confuse microarchitecture with architecture. When they start talking about caches, how many parallel pipes there are, how out-of-order things can be, etc., that’s microarchitecture, not architecture. (Though the job title “architect” covers both).

Things like how many architectural registers there are, the ISA, etc., are "architecture." Things like how many physical registers there are, how deep the reorder queue is, etc., are microarchitecture.
And of course there's VLIW off in its own weird world where many things that are microarchitecture in other contexts become part of the architectural spec. The exception which proves the rule.
 
VLIW has been commercially implemented twice in mainstream CPUs – by Intel (Itanium) and by Transmeta. The Intel version has been EoLed for several years now, being way too hot and not all that impressive. The Transmeta design had some small amount of success, but not enough to keep the company from going under.
 