Schrödinger’s Order, Disorder and Entropy

Recently I gave a talk, “Does Entropy Play a Role in Biology?”, in which I gave a short introduction to thermodynamic entropy and discussed misconceptions about entropy in biology. Now I see that I failed to mention an important text that seems to have played a big role in spreading one particular misconception about entropy among biologists. This is Chapter 6, “Order, Disorder and Entropy”, in Schrödinger’s What is Life?

Let me start with a quote from a paper that I read recently:

Fingelkurts, A., Fingelkurts, A., and Neves, C. (2010). “Natural World Physical, Brain Operational, and Mind Phenomenal Space-Time”. Physics of Life Reviews 7(2): 195-249.

“The degree of disorder or lost energy is qualified as entropy.”

This is inessential to the paper as such but it shows that the misconception created by Schrödinger still enjoys widespread use.

I see three major problems with Schrödinger’s chapter.

A system in a uniform environment

“When a system that is not alive is isolated or placed in a uniform environment, all motion usually comes to a standstill very soon as a result of various kinds of friction; differences of electric or chemical potential are equalized, substances which tend to form a chemical compound do so, temperature becomes uniform by heat conduction. After that the whole system fades away into a dead, inert lump of matter. A permanent state is reached, in which no observable events occur. The physicist calls this the state of thermodynamical equilibrium, or of ‘maximum entropy’.”

“An isolated system or a system in a uniform environment (which for the present consideration we do best to include as a part of the system we contemplate) increases its entropy and more or less rapidly approaches the inert state of maximum entropy.”

Let us consider these statements carefully. It actually does not matter what system one takes, alive or not: in both cases the equilibrium state will be reached. It may well be that with a living system one has to wait somewhat longer for equilibrium, but there is no difference in this respect. Any organism under such conditions will end up dead and destroyed.

Now let us consider a system in a uniform environment. In this case the maximum entropy principle does not apply. A simple example: let us put a glass of hot water in a room at normal temperature. The temperature of the water in the glass will decrease spontaneously, but the entropy of the water will decrease as well. The maximum entropy principle applies to an isolated system only, and it is impossible to extend it to a system in a uniform environment.
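The glass-of-water example is easy to make quantitative. Here is a minimal sketch (the mass, temperatures, and heat capacity are my own illustrative assumptions, not from the text): the entropy change of the water on cooling at constant pressure is m·c_p·ln(T_room/T_hot), which is negative, while the room receives the released heat at T_room and gains Q/T_room, so the total entropy of water plus room still increases.

```python
import math

# Illustrative numbers (assumed): a glass with 0.2 kg of water cooling
# from 90 degrees C to a room at 25 degrees C.
m = 0.2          # kg of water
c_p = 4186.0     # J/(kg K), specific heat of liquid water
T_hot = 363.15   # K (90 C)
T_room = 298.15  # K (25 C)

# Entropy change of the water itself, integrating dQ/T along a
# reversible cooling path: m * c_p * ln(T_final / T_initial).
dS_water = m * c_p * math.log(T_room / T_hot)

# The heat released by the water is absorbed by the room at T_room.
Q = m * c_p * (T_hot - T_room)
dS_room = Q / T_room

dS_total = dS_water + dS_room
print(f"dS_water = {dS_water:.1f} J/K")  # negative: the water's entropy decreases
print(f"dS_room  = {dS_room:.1f} J/K")
print(f"dS_total = {dS_total:.1f} J/K")  # positive: the Second Law holds for water + room
```

The point is exactly the one made above: the Second Law constrains the total (water plus surroundings), not the entropy of the non-isolated system taken alone.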

It seems that Schrödinger needed a system in a uniform environment in order to discuss the entropy of a biological organism as such while the organism interacts with the environment. However, in this case the Second Law says nothing about whether the entropy of the organism must increase or decrease. Both cases are allowed, and the actual change in entropy depends on the particular processes taking place between the organism and its environment.

One could say that Schrödinger understood the Second Law correctly (see the parenthetical remark in the second quotation). Yet in that case the text is at best sloppy, as it creates the impression that maximum entropy concerns not only an isolated system but also a system in a uniform environment as such.

Statistical concept of order and disorder

Schrödinger states explicitly that entropy has something to do with order and disorder:

“Much more important for us here is the bearing on the statistical concept of order and disorder, a connection that was revealed by the investigations of Boltzmann and Gibbs in statistical physics. This too is an exact quantitative connection, and is expressed by

entropy = k log D,

where k is the so-called Boltzmann constant (= 3.2983 × 10⁻²⁴ cal./°C), and D a quantitative measure of the atomistic disorder of the body in question.”
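Taken at face value, the formula is a simple counting exercise. A minimal sketch, using an ideal lattice-gas model of my own choosing (not anything in Schrödinger’s text): D, today usually written W, is the number of ways to place N indistinguishable particles on M sites, and the entropy follows directly from S = k ln W.

```python
import math

k_B = 1.380649e-23  # J/K, Boltzmann constant (exact SI value)

def lattice_entropy(sites: int, particles: int) -> float:
    """Entropy of an ideal lattice gas: S = k_B * ln W,
    where W = C(sites, particles) counts the microstates."""
    W = math.comb(sites, particles)
    return k_B * math.log(W)

# Spreading the same 10 particles over more sites increases the number
# of microstates W, and hence the entropy:
S_small = lattice_entropy(20, 10)
S_large = lattice_entropy(100, 10)
print(S_small, S_large)  # S_large > S_small
```

Note that the computation says nothing about “order” or “disorder” as such; it only counts microstates, which is exactly the point of the discussion that follows.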

The notion of entropy as a measure of disorder can indeed be traced back to Boltzmann, and Schrödinger in his chapter repeats this idea without further thought. Formally speaking, however, D in the equation above is the number of possible microstates compatible with a given macrostate. Its relationship to order and disorder turns out not to be that straightforward. The main problem is that intuition about order and disorder works only to a limited extent. Let me present a couple of examples.

A) From thermodynamic tables, the standard molar entropy of silver, S(Ag, cr) = 42.55 J K⁻¹ mol⁻¹, is larger than that of aluminium, S(Al, cr) = 28.30 J K⁻¹ mol⁻¹. Does this mean that there is more disorder in silver than in aluminium?

B) Let us consider an example from Schrödinger’s chapter about sugar in water:

“the sugar reached its aim of being equally distributed among all the liquid water available.”

“The gradual ‘spreading out’ of the sugar over all the water available increases the disorder D, and hence (since the logarithm of D increases with D) the entropy.”

It is not that simple, however. One should know that there is a maximum possible concentration of sugar in water; beyond it, solid sugar stays in equilibrium with a saturated solution. Now compare a hypothetical state in which all the sugar is equally distributed throughout the liquid water with the state of solid sugar plus a saturated solution. In which case do we have more order or disorder?

Such a situation is well known to everybody who uses entropy to solve practical problems. For example, from

E. I. Kozliak and F. L. Lambert, “‘Order-to-Disorder’ for Entropy Change? Consider the Numbers!”, Chem. Educator 2005, 10, 24–25.

“Defining entropy increase as a change from order to disorder is misleading at best and incorrect at worst.”

The situation gets even worse when one starts mixing order/disorder with information. From Rudolf Arnheim, Entropy and Art: An Essay on Disorder and Order, 1971:

“The absurd consequences of neglecting structure but using the concept of order just the same are evident if one examines the present terminology of information theory. Here order is described as the carrier of information, because information is defined as the opposite of entropy, and entropy is a measure of disorder. To transmit information means to induce order. This sounds reasonable enough. Next, since entropy grows with the probability of a state of affairs, information does the opposite: it increases with its improbability. The less likely an event is to happen, the more information does its occurrence represent. This again seems reasonable. Now what sort of sequence of events will be least predictable and therefore carry a maximum of information? Obviously a totally disordered one, since when we are confronted with chaos we can never predict what will happen next. The conclusion is that total disorder provides a maximum of information; and since information is measured by order, a maximum of order is conveyed by a maximum of disorder. Obviously, this is a Babylonian muddle. Somebody or something has confounded our language.”


The misconception that entropy is disorder, plus a vague notion of a system in a uniform environment, led Schrödinger to introduce negentropy to describe biological systems. This is done in Schrödinger’s chapter in vague language, and I do not want to analyze here what Schrödinger may have had in mind. I will just take a quote from Fingelkurts’ paper to show how biologists have understood Schrödinger:

“Thus, changes in entropy provide an important window into self-organization: a sudden increase of entropy just before the emergence of a new structure, followed by brief period of negative entropy (or negentropy).”

Such a statement is not even wrong. It has no meaning.


The sloppy language in Schrödinger’s chapter, taken together with his authority in science, has led to the dissemination of misconceptions among biologists. What a pity.


On the biosemiotics list, Howard Pattee has mentioned the paper

M. F. Perutz, “Physics and the Riddle of Life” (review of E. Schrödinger, What is Life?), Nature 326: 555–558, 1987,

where Perutz concluded “What was true in his book was not original, and most of what is original was known not to be true even when it was written.”


3 responses to “Schrödinger’s Order, Disorder and Entropy”

  1. Sungchul Ji says:

    Hi, Evgenii,

    I agree with your analysis of Schroedinger.

    Just as von Neumann initiated the conflation between thermodynamic entropy and information by suggesting that Shannon call his equation for information the entropy equation, so Schrödinger initiated the conflation between order and negative entropy [which was later called ‘negentropy’ by L. Brillouin, Negentropy Principle of Information, J. Appl. Phys. 24(9): 1152–1163 (1953)] by blindly manipulating the Boltzmann equation, ignoring the constraints imposed by the Third Law of Thermodynamics, which prohibits entropy from being negative [S. Ji (2012). Third Law of Thermodynamics and ‘Schroedinger’s Paradox’. In: Molecular Theory of the Living Cell: Concepts, Molecular Mechanisms, and Biomedical Applications. Springer, New York, Section 2.1.5, pp. 12–15]. To undo the damage inflicted on the fields of biology, computer science, and philosophy of science by these prominent physicists and their followers, it may take worldwide, interdisciplinary efforts to educate the general scientific community on, and disseminate, the correct relations between thermodynamics and informatics. Blogs such as yours will play important roles in meeting this enormous challenge.

  2. Stan Salthe says:

    If you persist in using “information” when you should use ‘informational entropy’ (variety), you will perpetuate the confusion.

  3. You are right, I usually use the term informational entropy. However, I like Arnheim’s book, and in this case I used the term from the quote.