Cool-running non-entropic computation (deeply nerdy)


I remember reading about this some 35 years ago in this magazine.

Current Sci-Am Article

Computation is, by nature, entropic. Take "c = a + b". Upon performing the calculation, you (theoretically) have a useful value in c. In the process, you have destroyed information, because "a + b", while not as useful to you as "c", is more information. This is essentially the same on paper as it is in a CPU: even though you have not physically destroyed the original terms, they no longer interest you, so you have mentally destroyed them.
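To make that concrete, here's a toy sketch (my own illustration, not from the article): once you keep only c, many different (a, b) pairs have collapsed onto the same value, so the inputs are unrecoverable in principle.

```python
# Toy illustration: ordinary addition destroys information.
# Knowing only c = a + b, you cannot recover (a, b): many distinct
# input pairs map to the same output.

def preimages(c, bits=4):
    """All (a, b) pairs of `bits`-bit ints whose sum is c."""
    n = 1 << bits
    return [(a, b) for a in range(n) for b in range(n) if a + b == c]

print(len(preimages(7)))  # 8 distinct (a, b) pairs all yield c = 7
```

Eight pairs, one output: log2(8) = 3 bits of information gone.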

It still appears to be a long way from anything that will be practically meaningful to any of us, but, if you are into this sort of thing, it is quite interesting.

This is what happens in logic circuitry. Source data goes through combining gates that reduce it to result data, destroying the source. This is why computers produce waste heat (entropy – exactly the same effect as when your car destroys fuel in order to produce the useful result of being somewhere else).
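The thermodynamic floor on that waste heat is Landauer's limit: erasing one bit must dissipate at least kT ln 2. A quick back-of-envelope at room temperature (standard physics, not specific to the article):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0           # roughly room temperature, K

# Landauer limit: minimum energy dissipated per bit of erased information.
E_bit = k_B * T * math.log(2)
print(f"{E_bit:.3e} J per erased bit")  # ~2.87e-21 J
```

Real CMOS gates dissipate many orders of magnitude more than this per switching event, which is why the limit is of theoretical rather than immediate practical interest.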

The original article talked about balancing the reduction with a reversing structure, simultaneously destroying and restoring the source data. That, however, is difficult to implement in a design that does not become infinitely slow. The research in this area has not stood still, though. This article explores handling data as the momentum of charge rather than its location – kind of like switching the perspective from one side of Heisenberg's uncertainty principle to the other.
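The reversible idea can be sketched with a standard reversible-arithmetic idiom (my own illustration, not the article's actual circuit): instead of mapping (a, b) → c, map (a, b) → (a, a + b). That map is a bijection, so no information is erased and, in principle, no Landauer heat is owed.

```python
def rev_add(a, b, bits=8):
    """Reversible add: (a, b) -> (a, (a + b) mod 2**bits)."""
    mask = (1 << bits) - 1
    return a, (a + b) & mask

def rev_add_inverse(a, s, bits=8):
    """Undo rev_add: recover (a, b) from (a, a + b)."""
    mask = (1 << bits) - 1
    return a, (s - a) & mask

a, b = 5, 9
a2, s = rev_add(a, b)
print(rev_add_inverse(a2, s))  # (5, 9) -- the source data is restored
```

The catch the post alludes to: carrying the extra outputs around (and uncomputing them later) costs circuit area and time, which is a big part of why reversible designs tend to be slow.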


We did some adiabatic circuits in the '90s, and it was quite a fad back then. The performance/power trade-off generally was not good.