06.09.2024: The old text has been replaced by a translation of a new text written in Russian:
Erwin Schrödinger and Negative Entropy
Citations are taken from English editions, and the references have been changed accordingly. Google and Yandex translation tools were used during translation.
__
In his book ‘What is Life?‘ Erwin Schrödinger introduced the term ‘negative entropy’, which has become so popular among biologists and intellectuals. Schrödinger explained his choice this way:
‘The remarks on negative entropy have met with doubt and opposition from physicist colleagues. Let me say first, that if I had been catering for them alone, I should have let the discussion turn on free energy instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to energy for making the average reader alive to the contrast between the two things.’
However, as a result, the desire for a popular presentation by means of an ill-suited metaphor has led to the spread of misconceptions.
I do not want to belittle the significance of Schrödinger’s book ‘What is Life?‘ (see, for example, the article by the historian Edward Yoxen). We must acknowledge Schrödinger’s courage in deciding to write such a book at all, and without any doubt he had the right to conduct the discussion in the way he did. However, this should not prevent criticism of his views, especially regarding negative entropy. Schrödinger himself knew statistical mechanics well; he wrote a good textbook, ‘Statistical Thermodynamics‘ (1944). The problem concerns people whose knowledge of thermodynamics is limited and who take Schrödinger’s metaphors literally.
Below I will consider the emergence of the entropy-as-disorder metaphor; it developed towards the end of the 19th century and remains in use to this day. Schrödinger takes this metaphor and transforms it into a form using negative entropy. I will describe the logic behind the emergence of the metaphor of negative entropy in Schrödinger’s book, and then show the irrelevance of the entropy-as-disorder metaphor and propose an alternative metaphor based on the concept of free energy.
- Entropy as Disorder in the 19th Century
- Schrödinger: Order, Disorder and Entropy
- Discussion: Order and Disorder
- Discussion: Understanding Chemical Reactions
Entropy as Disorder in the 19th Century
The first law of thermodynamics introduces the conservation of energy, and this raises a difficult question – what does it mean to ‘save energy’ when energy is conserved anyway? The main point is that not all of the energy can be used to produce work. In thermodynamics we therefore speak of free energy, although the exact meaning of this term depends on the conditions of the process. As usual, a true understanding of what is happening can only be achieved by working out the equations, but at the level of metaphors one can say that the total energy of a system consists of free energy, which can be used to produce work, and a second part, which is tied to the inevitable thermal processes.
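At the level of formulas, the split just described can be sketched with standard textbook relations (nothing here is specific to Schrödinger's text):

```latex
% Helmholtz free energy at constant temperature and volume:
F = U - TS
\qquad\Longrightarrow\qquad
U = \underbrace{F}_{\text{available for work}}
  + \underbrace{TS}_{\text{bound to thermal processes}}
% At constant temperature and pressure the relevant free energy
% is the Gibbs energy:
G = H - TS
```

Which quantity plays the role of ‘the’ free energy thus depends on the conditions of the process.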
In kinetic theory, work is associated with ordered motion, and heat with disordered motion. The entropy change, in turn, is related to heat, and this gives us the first path to the entropy-as-disorder metaphor. Historians give priority here to Helmholtz – in 1882 he put it this way:
‘In this sense, the magnitude of entropy could be described as the measure of disorder.’
In this case, what was meant was the connection between the entropy change and random motion during heat exchange.
The second route to the entropy-as-disorder metaphor is Boltzmann’s equation for entropy, which appeared in 1877. Boltzmann proposed a statistical explanation for the entropy increase of a monatomic ideal gas in an isolated system. The original paper did not discuss order and disorder at all; the only statement was that the movement toward equilibrium is equivalent to a transition from less probable states to more probable ones. Therefore, the equilibrium state in an isolated system is the most probable one.
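Boltzmann's statement can be illustrated with a minimal counting sketch (my illustration, not Boltzmann's own computation): for N ideal-gas particles distributed between the two halves of an isolated box, the number of microstates with n particles in the left half is the binomial coefficient C(N, n), and the equilibrium even split is simply the most probable macrostate.

```python
from math import comb

# N gas particles distributed between the two halves of an isolated box.
# The number of microstates with n particles in the left half is C(N, n).
# Boltzmann's point: equilibrium is simply the most probable macrostate.
N = 100
multiplicities = [comb(N, n) for n in range(N + 1)]
most_probable = multiplicities.index(max(multiplicities))
print(most_probable)  # 50: the even split has the most microstates
```

For large N the peak at the even split dominates overwhelmingly, which is why the gas is essentially never observed far from it.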
However, later the less probable states came to be identified with ordered ones, and the more probable with disordered ones. Boltzmann himself had switched to this language by the end of the 19th century; I will quote from ‘Lectures on Gas Theory‘ (1895-1898):
‘In particular, our theory does require that each time when bodies are interacting, the initial state of the system they form must be distinguished by a special property (ordered or improbable) which relatively few states of the same mechanical system would have under the external mechanical conditions in question. Hereby the fact is clarified that this system takes in the course of time states which do not have these properties, and which one calls disordered. Since by far most of the states of the system are disordered, one calls the latter the probable states.’
I have not found who, in the period between Helmholtz’s statement and Boltzmann’s quote, was the first to take this step, but by the beginning of the 20th century the entropy-as-disorder metaphor had become firmly established. For example, Max Planck in his famous papers on black-body radiation (1900 and 1901) simply states ‘entropy means disorder’. This is entirely consistent with the currently common idea of the equilibrium state as complete disorder – the most probable state is identified with the most disordered.
Schrödinger: Order, Disorder and Entropy
In his discussion of nonequilibrium processes and life, Schrödinger in the sixth chapter of his book ‘What is Life?‘ starts from the entropy-as-disorder metaphor based on the Boltzmann equation for entropy. Hence the behavior of living organisms appears mysterious, as discussed in the first two sections ‘Order Based on Order‘ and ‘Living Matter Evades the Decay to Equilibrium‘.
It is worth noting the metaphorical nature of Schrödinger’s text already at this level. For example, Schrödinger speaks simultaneously of an isolated system and of a system in a uniform environment, claiming that entropy increases in both; at the same time it is impossible to understand what exactly Schrödinger means by a system in a uniform environment. Schrödinger later wants to discuss the entropy increase in the organism, but formally this is valid only when the organism is treated as an isolated system. It therefore seems that the metaphor of a system in a uniform environment serves to consider the organism simultaneously as an isolated system and as a system that exchanges matter and energy with its environment.
Schrödinger explains the mysterious behavior of life in the next section, with the expressive title ‘It Feeds on Negative Entropy‘. It contains the main conclusion, which has excited the imagination of intellectuals up to the present day:
‘Thus, a living organism continually increases its entropy – or, as you may say, produces positive entropy – and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy – which is something very positive as we shall immediately see. What an organism feeds upon is negative entropy.’
It should be noted that here, again, the entropy increase (the organism as an isolated system) is discussed alongside the organism’s interaction with the environment (the organism as a non-isolated system). Schrödinger then explains entropy as a physical quantity in the section ‘What is Entropy?‘, and in the next section, ‘The Statistical Meaning of Entropy‘, he gives the Boltzmann equation for entropy:
‘entropy = k log D’
‘D a quantitative measure of the atomistic disorder of the body in question. To give an exact explanation of this quantity D in brief non-technical terms is well-nigh impossible.’
Schrödinger gives several examples; later I will focus on this one:
‘The gradual “spreading out” of the sugar over all the water available increases the disorder D, and hence (since the logarithm of D increases with D) the entropy.’
It is important to note that the Boltzmann equation applies only to an isolated system. Thus, Schrödinger continues to play on the initial ambiguity of the system in question – whether the organism is an isolated system or not.
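Schrödinger's sugar example can be given a quantitative sketch under a standard simplification (ideal lattice mixing; the molecule counts below are arbitrary illustrations): the mixing entropy of a compartment with sugar fraction x, in units of Boltzmann's k, is -N(x ln x + (1-x) ln(1-x)), and spreading the sugar evenly over the whole volume raises the total.

```python
from math import log

def mixing_entropy(n_sugar, n_water):
    # Ideal mixing entropy of one compartment, in units of Boltzmann's k:
    # S/k = -N * (x*ln(x) + (1-x)*ln(1-x)), with x the sugar fraction.
    n = n_sugar + n_water
    if n_sugar == 0 or n_water == 0:
        return 0.0
    x = n_sugar / n
    return -n * (x * log(x) + (1 - x) * log(1 - x))

# Two compartments of 1000 molecules each, 100 sugar molecules in total.
s_concentrated = mixing_entropy(100, 900) + mixing_entropy(0, 1000)
s_spread = mixing_entropy(50, 950) + mixing_entropy(50, 950)
print(s_concentrated < s_spread)  # True: the evenly spread state wins
```

This also shows what the metaphor hides: the result holds only for ideal mixing, and it says nothing about saturated solutions or phase equilibria, which are discussed below.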
In the final section, whose title reflects the central idea, ‘Organization Maintained by Extracting “Order” from the Environment‘, Schrödinger ties it all together:
‘If D is a measure of disorder, its reciprocal, 1/D, can be regarded as a direct measure of order. Since the logarithm of 1/D is just minus the logarithm of D, we can write Boltzmann’s equation thus:
– (entropy) = k log (1/D).
Hence the awkward expression “negative entropy” can be replaced by a better one: entropy, taken with a negative sign, is itself a measure of order.’
The idea is clear, although it ultimately seems like a truism. Entropy is disorder, and order is the opposite of disorder. Therefore, negative entropy is equivalent to order taken from the environment. The equation written here is the culmination of the use of metaphorical language, since one should not expect the entropy of an organism to decrease to zero by importing order from the environment, as the equation written by Schrödinger would suggest.
In conclusion, I note that Schrödinger’s presentation seems to be related to Boltzmann’s statement in his 1886 report ‘The Second Law of Thermodynamics‘:
‘The general struggle for existence of animate beings is therefore not a struggle for raw materials — these, for organisms, are air, water and soil, all abundantly available — nor for energy which exists in plenty in any body in the form of heat (albeit unfortunately not transformable), but a struggle for entropy, which becomes available through the transition of energy from the hot sun to the cold earth. In order to exploit this transition as much as possible, plants spread their immense surface of leaves and force the sun’s energy, before it falls to the earth’s temperature, to perform in ways as yet unexplored certain chemical syntheses of which no one in our laboratories has so far the least idea. The products of this chemical kitchen constitute the object of struggle of the animal world.’
Discussion: Order and Disorder
Let me remind you that in connection with the Boltzmann equation the less probable state was identified with the ordered one, and the more probable with the disordered one. In the case of a monatomic ideal gas one can follow Boltzmann’s train of thought behind such an identification. Yet even in this case one can find a discrepancy with the everyday idea of order and disorder.
Let’s take Schrödinger’s example – the dissolution of sugar in water. The most probable state corresponds to ideal mixing of sugar molecules among water molecules. Thus, disorder is identified with the homogeneous state of the solution, when there are no sugar concentration gradients. For comparison, consider painting a wall. Compare the final state, when the wall has a uniform color, with an intermediate state, when there are separate spots on the wall. The latter corresponds to the intermediate state during sugar dissolution, in which concentration gradients are still present. Most likely, order will mean the uniform color of the wall, not the intermediate state with spots. And yet these may not be spots at all, but a beautiful pattern. Try, for example, to compare the order in such a pattern with the order in the process of dissolving sugar in water.
Further, the dissolution of sugar in water is a good example of the limitations of the Boltzmann equation for entropy – one only needs to add more sugar to the water. At some point the sugar stops dissolving and an equilibrium state of two phases appears – a saturated sugar solution and sugar crystals. Moreover, if one prepares a supersaturated sugar solution, sugar crystals will precipitate out of it. This is thus an example in which the disorder of Schrödinger’s metaphor spontaneously turns into order.
The last example shows that the behavior of a monatomic ideal gas cannot be transferred to phase equilibria. This, in turn, means that transferring ideas about order and disorder from a monatomic ideal gas to a living organism is even more problematic. It is important to note that Boltzmann’s conclusion cannot be extended to more complex systems. A transition to the Gibbs ensemble method is required, but new problems arise there – the Gibbs statistical entropy formally remains constant during a nonequilibrium process in an isolated system (see Chapter 2.3 in Russian ‘The Arrow of Time in Statistical Mechanics‘). In the general case it is possible to give the Boltzmann equation for entropy a new meaning, but then the equation will contain a quantity that is no longer related to the number of permutations and that cannot be computed at all.
Thus, the attempt to link order and disorder with thermodynamic entropy is quite a futile endeavor. In this context, let us consider the idea that life is an example of a low-entropy state, since this statement is a consequence of the entropy-as-disorder metaphor. Analysis of such a statement leads back to Boltzmann’s original formulation and to the following logic: equilibrium is the most probable state, the organism is not in equilibrium, therefore the organism is in an improbable and thus low-entropy state.
However, discussing the probability of an organism’s state requires choosing the sample space over which this probability is defined. Boltzmann solved this problem for a monatomic ideal gas – the probability turned out to be proportional to the number of permutations in the μ-space. Using such a representation – probability as a number of permutations – for an organism is impossible even at a qualitative level. It does not work even for a saturated sugar solution, to say nothing of a living organism.
Even the question of establishing equilibrium, with which Schrödinger began his discussion, is not that simple. No doubt equilibrium is quickly established in the case of an ideal gas. But let us take a diamond and ask whether it is in a state of equilibrium or not. The answer is ambiguous – the diamond is in thermal and mechanical equilibrium, but not in chemical equilibrium. First, graphite is thermodynamically more stable than diamond. Second, the diamond sits in an oxygen atmosphere, and in this case the thermodynamically stable state is carbon dioxide. However, for kinetic reasons the diamond turns into neither graphite nor carbon dioxide, and therefore it serves as a good investment.
An organism also lives in an oxygen atmosphere. Therefore, formally, carbon dioxide, water and the oxides of other elements should be considered the equilibrium state of a living organism. However, it is doubtful that it makes sense to include such states in the sample space for determining the probability of the organism being in its current improbable and therefore ‘low-entropy’ state.
In conclusion, I will say that I do not object to the statement that the organism is in a low-entropy state; I am only saying that in this statement it is unclear what is being compared with what.
Discussion: Understanding Chemical Reactions
The discussion of entropy in Boltzmann’s ‘struggle for entropy’, as well as Schrödinger’s original discussion, primarily concerns chemical reactions. In this case, the transition to the Boltzmann entropy equation seems unjustified, since the latter concerns the state of matter without chemical reactions. It is more reasonable to return to Helmholtz’s original metaphor – it was precisely chemical reactions that Helmholtz discussed. At constant temperature and pressure the free energy is the Gibbs energy, so let us consider the equation for the change of the Gibbs energy during a chemical reaction:
ΔG = ΔH − T ΔS
The Δ sign denotes the change during a chemical reaction, G is the Gibbs free energy, H is the enthalpy, S is the entropy, T is the absolute temperature. Schrödinger is right that understanding this equation requires technical knowledge. On the other hand, it is impossible to avoid this equation when considering chemical reactions, since the alternative use of the entropy-as-disorder metaphor based on the Boltzmann equation leads to a loss of understanding.
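As a numerical illustration of this equation, here is a sketch for methane combustion with approximate textbook-level standard values (the figures are rounded and serve only to show the structure of the calculation):

```python
# Approximate textbook-level standard values for methane combustion,
# CH4 + 2 O2 -> CO2 + 2 H2O(liquid), at T = 298 K (rounded figures):
dH = -890.0e3   # J/mol: enthalpy change, the total energy released
dS = -243.0     # J/(mol K): entropy change of the reacting substances
T = 298.0       # K

dG = dH - T * dS   # Gibbs energy change: the part available as work
print(dG / 1e3)    # about -818 kJ/mol: smaller in magnitude than dH
```

The gap between ΔH and ΔG is precisely the T ΔS term that the entropy-as-disorder metaphor on its own cannot express.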
At the level of metaphor, one can say that the enthalpy change characterizes the total energy change, the Gibbs energy change is related to the ordered motion involved in work, and the term T ΔS is related to the disordered motion involved in heat exchange. Such a view helps us understand Helmholtz’s thought about the connection between entropy and disorder.
However, Helmholtz was a bit hasty. In a particular state of the system there are still Gibbs energy, enthalpy, temperature and entropy – these are functions of state. Work and heat, however, are characteristics of a process; they are not functions of state. A particular state of the system cannot be associated with work and heat. Thus, the metaphor of ordered and disordered motion during a chemical reaction cannot be transferred to a particular state of the system.
Now I will write out Schrödinger’s reasoning, in which he rejects the explanation at the level of free energy:
‘How does the living organism avoid decay? The obvious answer is: By eating, drinking, breathing and (in the case of plants) assimilating. The technical term is metabolism. This Greek word means change or exchange. Exchange of what? Originally the underlying idea is, no doubt, exchange of material. (E.g. the German for metabolism is called Stoffwechsel.) That the exchange of material should be the essential thing is absurd. Any atom of nitrogen, oxygen, sulfur, etc., is as good as any other of its kind, what could be gained by exchanging them? For a while in the past our curiosity was silenced by being told that we feed upon energy. In some very advanced country (I don’t remember whether it was Germany or the U.S.A. or both), you could find menu cards in restaurants indicating, in addition to the price, the energy content of every dish. Needless to say, taken literally, this is just as absurd. For an adult organism the energy content is as stationary as its material content. Since each calorie is worth as much as any other calorie, one cannot see how a mere exchange could help.’
It is worth noting the similarity with Boltzmann’s arguments to introduce ‘struggle for entropy’.
Let us consider the arguments of Schrödinger and Boltzmann using examples from the nonliving world. One example is the gasoline engine; thermodynamics began with the heat engine, and so this example best shows thermodynamic reasoning in terms of work. Another example is the burning candle; it shows the formation of structure during combustion. Understanding of chemical reactions should first be achieved at the level of the nonliving; only then can we attempt to transfer this understanding to the living.
But first, here is Schrödinger’s conclusion, which confirms that ultimately we are talking about chemical reactions:
‘Indeed, in the case of higher animals we know the kind of orderliness they feed upon well enough, viz. the extremely well-ordered state of matter in more or less complicated organic compounds, which serve them as foodstuffs. After utilizing it they return it in a very much degraded form – not entirely degraded, however, for plants can still make use of it. (These, of course, have their most powerful supply of “negative entropy” in the sunlight.)’
It is clear that this conclusion echoes Boltzmann’s ‘struggle for entropy’; only Schrödinger, to align with the entropy-as-disorder metaphor, gave entropy a negative sign in order to make order out of disorder. Let us consider how such reasoning looks when applied to the chemical reactions during the operation of a gasoline engine and the burning of a candle.
So, there is a device that requires incoming chemical substances to operate. These react with atmospheric oxygen, and the reaction products are removed from the device along with the heat produced. The free-energy consideration given above reduces to the energetics of the chemical reaction, which includes ordered motion (the change in free energy) and disordered motion (the product of temperature and entropy change).
Let us now try to apply Schrödinger’s logic instead. This raises the unsolvable task of finding an increase of disorder in the state of a running gasoline engine or a burning candle. The increase in entropy that Schrödinger speaks of applies only to an isolated system, and the reference to a system in a uniform environment remains unclear. One can only try to imagine for a moment the candle flame and the working engine as isolated systems (entropy increases), and then immediately return to the exchange of energy and matter with the environment (entropy decreases). I do not think that such a representation helps in any way to understand the ongoing chemical reactions.
Boltzmann and Schrödinger rejected consideration at the level of matter exchange. Indeed, during chemical reactions the total number of atoms of each chemical element does not change (the law of mass conservation at the level of chemical elements). However, such a consideration overlooks the course of the chemical reactions, in which some substances are supplied as input and other substances leave the system. In this sense, the processes in a burning candle and a running engine should be recognized as matter exchange at the level of chemical transformation.
Similarly, Boltzmann and Schrödinger too quickly dismissed the consideration of energy due to the law of energy conservation – they overlooked that during a chemical reaction, work is performed and heat is exchanged. The purpose of introducing free energy is to separate the work performed from the heat exchange. The transition to the level of entropy or negative entropy deprives us of this opportunity.
Moreover, we must not forget that thermodynamics only sets the possible direction of a process; real processes depend on kinetics. Hence, the choice of the starting substances is constrained not only by thermodynamics but also by the kinetics of the processes. For example, using diesel fuel in a gasoline engine is not a good idea. Similarly, not all highly ordered substances in Schrödinger’s sense are suitable for participation in the organism’s metabolism.
In the thermodynamics of chemical reactions, energy and entropy are tied to each other, and it is therefore necessary to use the metaphor of free energy, which includes both energy and entropy. The word metaphor here means that additional nuances appear when one moves to the mathematical equations. I emphasize once again that a correct understanding is impossible without working through the relevant equations.
At the same time, the transition to the language of entropy alone (the entropy-as-disorder metaphor) ignores the connection between energy and entropy and thus leads to misunderstanding. Let me return to the idea of an organism as a low-entropy state. Let us imagine a rapid cooling of the organism to a very low temperature. Entropy will decrease, order will be preserved, but the organism will die. This is another example that shows the futility of using the idea of entropy as disorder.
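The cooling argument can be put into numbers with a deliberately crude sketch (the heat capacity value is a made-up illustration, and constant heat capacity is itself a rough assumption): for constant C, the entropy change on cooling from T1 to T2 is ΔS = C ln(T2/T1), which is negative.

```python
from math import log

# Deliberately crude sketch: a body with constant heat capacity C cooled
# from T1 to T2 changes its entropy by dS = C * ln(T2 / T1).
# The heat capacity value below is a made-up illustration, not physiology.
C = 250e3               # J/K: hypothetical total heat capacity of an organism
T1, T2 = 310.0, 80.0    # K: body temperature -> deep cooling

dS = C * log(T2 / T1)
print(dS < 0)           # True: entropy drops sharply, yet the organism dies
```

A large entropy decrease is thus perfectly compatible with death, which is the point of the example.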
In conclusion, I note that cosmologists have nowadays returned to Boltzmann’s idea of a ‘struggle for entropy’ and have begun to emphasize the role of low entropy of solar photons; that is, that it is not the photon energy (frequency) that is important, but the entropy; apparently entropy is fascinating. An analysis of these ideas is made in a separate post:
In Russian. Low-entropy energy of the Sun and life: Popular science books by physicists about low-entropy energy of the Sun – Carroll, Green and Penrose. A model problem is analyzed – thermodynamics of radiation, which gives a different view on what is happening.
Information
Erwin Schrödinger, What is Life? 1967. First published in 1944.
Edward J. Yoxen, Where does Schroedinger’s “What is life?” belong in the history of molecular biology? History of Science 17, no. 1 (1979): 17-52.
Hermann von Helmholtz, Die Thermodynamik chemischer Vorgänge, Sitzungsberichte der Akademie der Wissenschaften zu Berlin, 1, 22-39, 2 February 1882. S. 34.
Ludwig Boltzmann, Lectures on Gas Theory, Translated by Stephen G. Brush, 1995. p. 400. There is a misprint: ‘does not require’ instead of ‘does require’.
Max Planck, Zur Theorie des Gesetzes der Energieverteilung im Normalspectrum, Verh. Dtsch. Phys. Ges. Berlin, 1900, 2, 237 – 245.
Max Planck, Über das Gesetz der Energieverteilung im Normalspectrum. Ann. Phys., 1901, 4, 553 – 563.
Ludwig Boltzmann, Theoretical Physics And Philosophical Problems Selected Writings, 1974, The Second Law of Thermodynamics, p. 24.
Additional information
In Russian. Joseph Needham: Evolution and Thermodynamics: The 1942 article is still relevant. Needham on separating the discussion of organization in biology from thermodynamics. Eddington and Schrödinger quotes from the article and discussion. Metaphors and thermodynamics.