The Thermodynamics of Computation

My messages to the everything list

On 09.02.2012 07:49 meekerdb said the following:

There’s an interesting paper by Bennett that I ran across, which
discusses the relation of Shannon entropy, thermodynamic entropy, and
algorithmic entropy in the context of DNA and RNA replication:

10.02.2012 22:16:

Thank you for the link. I like the first sentence

“Computers may be thought of as engines for transforming free energy into waste heat and mathematical work.”

I am not sure, though, whether this is more than a metaphor. I will read the paper; the abstract looks nice.

I believe that there was a chapter on reversible computation in

Nanoelectronics and Information Technology, ed Rainer Waser

I guess reversible computation is a kind of strange attractor for engineers.

As for DNA, RNA, and proteins, I have recently read

Barbieri, M. (2007). Is the cell a semiotic system? In: Introduction to Biosemiotics: The New Biological Synthesis. Eds.: M. Barbieri, Springer: 179-208.

If the author is right, it might well be that language developed even before consciousness. By the way, the paper is very well written, and I have to think it over.

A related discussion

18.02.2012 16:49:

I have browsed the paper. It is nice indeed. A couple of comments.

1) Reversible computation

The author seems not to reject the idea of reversible computation. This, in my view, shows that the first statement from the paper

“Computers may be thought of as engines for transforming free energy into waste heat and mathematical work.”

just does not work literally. If reversible computation is possible, then there are no thermodynamic limits in this respect. What is left is just thermal noise of order kT.
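To put the kT scale in numbers, here is a minimal sketch (my own illustration, not from the paper) of Landauer’s bound, the kT ln 2 of heat that erasing one bit must dissipate and that reversible computation avoids:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer's bound: erasing one bit dissipates at least kT*ln(2) of heat.
# A logically reversible computer erases nothing and so evades this cost;
# only thermal noise of order kT remains.
kT = k_B * T
landauer = kT * math.log(2)

print(f"kT at {T:.0f} K          = {kT:.3e} J")
print(f"Landauer bound kT ln 2 = {landauer:.3e} J per erased bit")
```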

2) Maxwell demon

I have never understood the problem with Maxwell’s demon. Why is it not enough to say that it does not exist? Why, for example, does Maxwell’s demon capture the imagination of physicists and engineers, while the idea of God does not?

3) Reversible chemical reactions and reversible thermodynamic processes

I think that the author misuses the term reversible, in the sense that the word has completely different meanings in thermodynamics and in chemistry. In thermodynamics, a reversible process implies that the entropy of the system and its surroundings does not change (the entropy of the Universe remains constant). In chemistry, the term reversible reaction means that two reactions (forward and backward) run in parallel. Therefore, by playing with the conditions, we could transform A to B and then B back to A. However, when a reversible chemical reaction takes place, it is impossible to implement it as a reversible thermodynamic process. Hence a reversible chemical reaction is not thermodynamically reversible.

4) Algorithmic entropy

I have missed the point about the connection between algorithmic entropy and thermodynamic entropy. Here it would be good to come back to Jason’s example about his work on secure pseudo-random number generators.

What thermodynamic system should be considered here at all?

In my view, the algorithm is independent of implementation details. It seems that this is one of the points on this list where people claim that it could be possible to make a conscious robot. Yet how, then, could the thermodynamic entropy be connected with the algorithmic entropy?

5) DNA, RNA and information

I have recently read

Barbieri, M. (2007). Is the cell a semiotic system? In: Introduction to Biosemiotics: The New Biological Synthesis. Eds.: M. Barbieri, Springer: 179-208.

and below there is a quote. It presents quite a different viewpoint on the processes in the cell.

“At this point, we can summarize all the above concepts by saying that in protein synthesis:

  • (1) Organic information is the sequence used by a copymaker during a copying process.
  • (2) An organic sign is the sequence used by a codemaker during a coding process.
  • (3) An organic meaning is the sequence produced by a codemaker during a coding process.
  • (4) Organic information, organic signs and organic meaning are neither quantities nor qualities. They are a new kind of natural entities which are referred to as nominable entities.
  • (5) Organic information, organic signs and organic meaning have the same scientific status as physical quantities because they are objective and reproducible entities which can be defined by operative procedures.
  • (6) Organic information, organic signs and organic meaning have the same scientific status as fundamental physical quantities because they cannot be reduced to, or derived from, simpler entities.”

19.02.2012 11:32:

to Maxwell demon

“If one properly defines a thought experiment with Maxwell’s demon, then it is quite clear that such a thing does not exist. Why then spend so much time on it?”

to reversible computation

“It is hard to say for sure what the author meant. Let me first quote him.

p. 912(8) “It is well known that all chemical reactions are in principle reversible: the same Brownian motion that accomplishes the forward reaction also sometimes brings product molecules together, pushes them backward through the transition state, and lets them emerge as reactant molecules.”

p. 934(30) “As indicated before, the synthesis of RNA by RNA polymerase is a logically reversible copying operation, and under appropriate (nonphysiological) conditions, it could be carried out at an energy cost of less than kT per nucleotide.”

My understanding was that in the first quote, reversible has its meaning from chemistry. Let us consider, for example, a reaction

A = B

with a forward reaction rate of 1000 and a backward reaction rate of 1. Then we can imagine two different initial states:

1) C(A) = 1, C(B) = 0
2) C(A) = 0, C(B) = 1

The equilibrium state will be the same, but we reach it from different sides. In both cases, however, the process will be thermodynamically irreversible.
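This A = B example can be sketched numerically (my own illustration; rate constants as above, concentrations in arbitrary units):

```python
# Chemically "reversible" reaction A = B with forward rate 1000 and
# backward rate 1. Both initial states relax to the same equilibrium,
# but each relaxation is a spontaneous, entropy-producing process,
# i.e. thermodynamically irreversible.
kf, kb = 1000.0, 1.0

def relax(A, B, dt=1e-5, steps=200_000):
    """Integrate dA/dt = -kf*A + kb*B with explicit Euler steps."""
    for _ in range(steps):
        flux = (kf * A - kb * B) * dt
        A -= flux
        B += flux
    return A, B

state1 = relax(1.0, 0.0)  # start from C(A) = 1, C(B) = 0
state2 = relax(0.0, 1.0)  # start from C(A) = 0, C(B) = 1

# Both sides converge to the same equilibrium, A_eq = kb/(kf + kb)
print(state1)
print(state2)
```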

My point was that one word has different meanings and it would be good to understand what has been meant.”

to algorithmic entropy

“I have read once more section 6, “Algorithmic entropy and thermodynamics” (p. 936 (30)), of the paper. I should confess that I do not know exactly what the author meant by algorithmic entropy. My reading was

algorithmic entropy == entropy of an algorithm

and I have used, and will stick to, this meaning.

In my understanding, when we consider an algorithm, it is a pure IT construct that does not depend on whether I implement it with an abacus or some Turing machine, with an Intel or a PowerPC processor. From this it follows that the algorithm, and hence its entropy, does not depend on the temperature or pressure of the physical system that does the computation. In my view, this makes sense.
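A toy illustration of this implementation independence (my own sketch; compressed size is only a computable upper bound on the uncomputable Kolmogorov quantity):

```python
import random
import zlib

# The compressed size of a string is an upper bound on its algorithmic
# (Kolmogorov) entropy, and it depends only on the string itself -- not
# on the temperature, pressure, or CPU of the machine doing the work.
regular = b"01" * 500  # highly regular: admits a short description

random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(1000))  # no pattern

print(len(zlib.compress(regular)))  # small: low algorithmic entropy
print(len(zlib.compress(noisy)))    # close to 1000: nearly incompressible
```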

Let us consider consciousness now. Our brain produces it, and our brain has some thermodynamic entropy. If we assume that the same effect could be achieved with some robot, does it mean that the thermodynamic entropy of the robot must be the same as that of the brain?”

to semiotics

“Everything is in comparison. Recently I got interested in Artificial Life, and people recommended to me Christoph Adami’s “Introduction to Artificial Life“. He claims, for example,

p. 5 “An even more general approach is the thermodynamic one, which attempts to define living systems in terms of their ability to maintain low levels of entropy, or disorder, only“.

One could say something like this, but then the question is what the entropy is. And this is what Adami writes about entropy:

p. 94 “Entropy is a measure of the disorder present in a system, or alternatively, a measure of our lack of knowledge about this system.”

p. 96 “If an observer gains knowledge about the system and thus determines that a number of states that were previously deemed probable are in fact unlikely, the entropy of the system (which now has turned into a conditional entropy) is lowered, simply because the number of different possible states is now lower. (Note that such a change in uncertainty is usually due to a measurement).”

p. 97 “Clearly, the entropy can also depend on what we consider “different”. For example, one may count states as different that differ by, at most, del_x in some observable x (for example, the color of a ball drawn from an ensemble of differently shaded balls in an urn). Such entropies are then called fine-grained (if del_x is small), or coarse-grained (if del_x is large) entropies.”
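To make the coarse-graining point concrete, a minimal sketch (my own, not Adami’s code) showing that the Shannon entropy of the same ensemble of balls changes with what we count as “different”:

```python
import math
from collections import Counter

def shannon_bits(samples):
    """Shannon entropy, in bits, of the empirical distribution."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

shades = list(range(16)) * 10  # 16 equally likely shades in the urn

fine = shannon_bits(shades)                      # every shade "different"
coarse = shannon_bits([s // 4 for s in shades])  # lump into 4 groups

print(fine)    # 4.0 bits (fine-grained)
print(coarse)  # 2.0 bits (coarse-grained)
```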

I am a thermodynamicist, and frankly speaking, I was just shocked after reading this. For me it was clear that Adami does not know what experimental thermodynamics is (and presumably is unaware of it altogether).

I understand now that Adami’s viewpoint is quite common among physicists, but I do not think that it brings us any “useful conjecture”. I had a discussion about this on the biotaconv list; see the summary at


but no one there could explain to me what the difference in consequences for artificial life research is between the two statements:

  1. The thermodynamic and information entropies are equivalent.
  2. The thermodynamic and information entropies are completely different.

It seems that the choice between 1) and 2) does not influence artificial life research at all.

In this sense, I like Barbieri’s paper much more. At least I can follow his logic.”

19.02.2012 17:51:

to Maxwell demon

“I am not sure I understand what you mean. How is Maxwell’s demon possible in classical physics? I personally would say that Maxwell’s demon could work only within the brain of some crazy scientist.”

20.02.2012 19:33:

I have nothing against Adami’s book as such. His description of his software Avida and his experiments with it is okay. My point was about his claim that his work has something to do with thermodynamics. It definitely does not. The thermodynamic entropy is not there. The quotes from the book display this pretty clearly.

You have written about “an analogous role”. I would not object if you said that there is an analogy between the thermodynamic entropy and information. Yet I am against the statement that the thermodynamic entropy is information, and I believe that I have given many examples that show this. Thermodynamic entropy is not subjective and not context dependent*, so my claim is that Adami does not understand what the thermodynamic entropy is. He has never taken a class in experimental thermodynamics; this is the problem.

* I would accept the notion that the entropy is context dependent in the sense that its definition depends on the thermodynamic theory. If we change the theory, then the entropy could have some other meaning. But that seems not to be what you meant.

20.02.2012 21:02:

I have taken a class in statistical thermodynamics. Actually, it was a pretty good class, where the different approaches of Boltzmann, Gibbs and others were considered in detail.

The difference is that I do not believe that similar equations in different areas imply that the different things are the same.

If you would like to show that information is very useful in thermodynamics, please apply it to simple thermodynamic problems, to show how the concept of information simplifies, for example, the computation of a phase diagram (or of the equilibrium composition between N2, H2 and NH3). Should I repeat my examples?
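To show what I mean by a simple thermodynamic problem, here is the conventional calculation of the equilibrium constant for N2 + 3 H2 = 2 NH3 at 298 K (a sketch of mine, assuming the textbook value ΔG°f(NH3) ≈ -16.4 kJ/mol); no notion of information appears anywhere:

```python
import math

R = 8.314           # gas constant, J/(mol K)
T = 298.15          # temperature, K
dG_f_NH3 = -16.4e3  # assumed textbook dG_f of NH3 at 298 K, J/mol

# For N2 + 3 H2 = 2 NH3, the elements have dG_f = 0, so
# dG_rxn = 2 * dG_f(NH3), and K = exp(-dG_rxn / RT).
dG_rxn = 2 * dG_f_NH3
K = math.exp(-dG_rxn / (R * T))

print(f"K(298 K) = {K:.2e}")  # large K: equilibrium favors ammonia
```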

21.02.2012 21:03:

What does it mean for the application I have mentioned and for information in IT? I still do not understand this, as the numerical values of information in IT and of information as derived from the thermodynamic entropy are quite different. Hence it is completely unclear how to use this in practical applications. Then what does it bring?

You have written about semiotics

“I’ve never seen one useful conjecture come out of it.”

What useful conjectures come from saying that, because the equations for information and the entropy are the same, they must be the same thing?

21.02.2012 21:16:

You underestimate chemists. As I have mentioned, they use molecular simulation extensively. You can find some examples in my old lectures (they are a bit outdated, though, as they are about eight years old).

But Shannon’s information is not there.

25.02.2012 10:53:

On 20.02.2012 19:54 meekerdb said the following:

I’m beginning to think you have never taken a class in statistical
mechanics. There’s a good online course here:

Those particularly relevant to this thread start at

and go through the next six or seven.

I wanted to study this text to understand the relationship between entropy and information better. However, I cannot find information in there; for instance, I guess that in the sentence

How can we obtain some information about the statistical properties of the molecules which make up air?

you do not mean the term information in its technical sense.

It seems that this is a normal course on statistical thermodynamics, as I am used to, where there is no notion of information in thermodynamics.

Have I missed something? How does this link help us in our discussion, from your viewpoint?
