Previous: Chapter 4. Entropy, Ignorance, and Information
As already mentioned, James Maxwell believed from the very beginning that it was impossible to find a strict proof of the second law within the framework of the kinetic theory. To demonstrate this, he invented a creature, later called Maxwell’s demon. Ludwig Boltzmann’s statistical interpretation of the second law essentially confirmed that Maxwell was right. In the 19th century, no one thought it necessary to fight Maxwell’s demon, since the second law has a statistical nature and there is always a possibility of its violation.
Einstein developed the theory of Brownian motion, and its experimental confirmation was the first proof of the reality of fluctuations. This discovery drew attention to the possibility of a second-kind perpetual motion machine exploiting fluctuations, and several hypothetical devices were proposed. Marian Smoluchowski treated Maxwell’s demon as a physical device that is itself subject to fluctuations. As a result, Smoluchowski only slightly weakened the formulation of the second law for the joint system of a naturalized Maxwell’s demon and the device it controls: such a system cannot produce usable work over a long period of time.
Leo Szilard proposed Maxwell’s information demon. Unlike the mechanical demons discussed in Smoluchowski’s work, Szilard’s thought experiment involved an intelligent demon controlling a device by means of measurements. This thought experiment had a significant impact on the development of statistical mechanics in the second half of the 20th century: physicists decided that Maxwell’s demon was a threat to the second law and that an exorcism was therefore required.
In the first step of the demon’s exorcism, Léon Brillouin used the negentropy principle of information. This consideration confirmed Szilard’s view that the entropy increase is due to measurement, and hence that the change in entropy is related to the information gained during the measurement. The second step of the demon’s exorcism was conducted within the framework of the thermodynamics of computation. Rolf Landauer proposed the principle of a minimal entropy change during memory erasure, and Charles Bennett used Landauer’s principle to give a new explanation of the impossibility of Maxwell’s demon. According to Bennett, a measurement can be made without entropy change, and the entropy increase is due to the memory erasure inside the demon.
- Maxwell’s demon
- Brownian motion and the second law
- Smoluchowski: naturalization of Maxwell’s demon
- Szilard’s demon
- Brillouin: negentropy principle of information
- Landauer and thermodynamics of computation
Maxwell’s demon
In an 1867 letter to Peter Tait, Maxwell described a device containing a tiny creature that William Thomson (later known as Kelvin) dubbed a demon. Maxwell expressed the idea explicitly in an 1870 letter to John Strutt (later known as Lord Rayleigh), where he emphasized the time symmetry of the laws of mechanics: it is entirely possible to transfer heat from a cold body to a hot one by reversing time in the laws of mechanics. But since such a reversal is impossible in practice, the thought experiment with the demon made the argument more plausible.
Maxwell’s demon appeared publicly in 1871 in the book ‘Theory of Heat’. Interestingly, Maxwell compared the creature to a pointsman, and the pointsman was Maxwell’s favorite figure for discussing free will. Maxwell was a devout Christian; he believed that scientific knowledge was compatible with religion and that hence in science there was room for free will. However, this aspect would take us away from the main topic.
Maxwell was satisfied with the idea of the statistical nature of the second law and did not examine the details. However, in 1878, in the article ‘Diffusion’ for the Encyclopaedia Britannica, Maxwell related this problem to the status of human knowledge:
‘It is only to a being in the intermediate stage, who can lay hold of some forms of energy while others elude his grasp, that energy appears to be passing inevitably from the available to the dissipated state.’
This means that for a creature that could monitor the movement of molecules (Maxwell’s demon), this problem would not exist:
‘the notion of dissipated energy could not occur … to one who could trace the motion of every molecule and seize it at the right moment.’
Brownian motion and the second law
The study of Brownian motion changed physicists’ attitude toward Maxwell’s demon. An understanding of the connection between Brownian and molecular motion emerged in the last quarter of the 19th century. Below are a couple of quotes from Gelfer’s book:
‘In a clear form, the assumption that Brownian motion is caused by the motion of liquid molecules was independently made by Carbonnelle (1874) and Ramsay (1876). In 1877, Desaulx came to the same conclusion … It should be noted that some quantitative patterns of Brownian motion were observed by F. Exner in 1867.’
‘Later, in 1888, L. Gouy found that the intensity of particle motion is inversely proportional to the viscosity of the liquid and directly proportional to its temperature. In 1900, F. Exner attempted to provide an analytical calculation of the velocity of a Brownian particle, assuming that its kinetic energy is equal to the kinetic energy of a gas molecule. However, the resulting velocities did not match the observed values.’
Gelfer here describes the problem with experimental studies of Brownian motion at that time. Physicists believed that it was possible to measure the velocity of the small particles, but this path led to a discrepancy between the observed values and the predictions of the theory. This may have been the reason that Boltzmann showed no interest in Brownian motion.
The situation changed after the theory developed by Einstein and Smoluchowski. Now, the displacement of a particle was used as an observable quantity, paving the way for experimental verification of the theory. This also demonstrates that the development of physics does not follow the inductive method.
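For reference, the resolution can be stated in one formula. Einstein’s theory takes the mean squared displacement, not the velocity, as the observable (a standard statement of the result, not a quotation from Gelfer):

$$\langle x^2 \rangle = 2Dt, \qquad D = \frac{k_B T}{6\pi\eta a},$$

where $\eta$ is the viscosity of the liquid and $a$ is the particle radius. Since the displacement grows as $\sqrt{t}$, the apparent velocity $\sqrt{\langle x^2 \rangle}/t$ depends on the observation interval, which explains why attempts to measure a well-defined particle velocity were bound to fail.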
During the study of Brownian motion, the possibility of a second-kind perpetual motion machine based on fluctuations was discussed. Again, a quote from Gelfer’s book:
‘There was a debate about Thomson’s statement (“a second-kind perpetual motion machine is impossible”). Some physicists, such as Lippmann, Svedberg, and Ostwald, believed that fluctuation phenomena allowed for the possibility of a second-kind perpetual motion machine, at least in principle. It seemed that Maxwell’s idea of “demons” sorting molecules based on their velocities found theoretical support in Brownian motion. Gouy had once suggested that if the Brownian motion could be organized in some way, it would open up the possibility of obtaining free energy. While Gouy’s suggestion was a hypothetical proposal, Ostwald explicitly stated in 1906 that the second law could be disproved:
“It seems to us that Maxwell’s ‘demons’, which in the molecular domain could be considered harmless, have an open field for experimental refutation of the second law in the finite domain of visible phenomena.”
Specific schemes for implementing a second-kind perpetual motion machine were even proposed, most of which were based on one or another variant of Maxwell’s “demons”.’
Smoluchowski: naturalization of Maxwell’s demon
Let us consider the position of Marian Smoluchowski, whose paper titles were quite provocative: ‘Experimentally Demonstrable Molecular Phenomena that Contradict Ordinary Thermodynamics’ (1912) and ‘The Limits of the Second Law of Thermodynamics’ (1914).
In classical thermodynamics, strictly speaking, fluctuations are impossible. In this sense, Brownian motion and the presence of fluctuations contradict the second law of thermodynamics. However, after discussing this fact, Smoluchowski naturalized Maxwell’s demon: the demon is declared to be a device that obeys the laws of physics and is itself subject to fluctuations. With this, Smoluchowski demonstrated the problematic nature of the proposed second-kind perpetual motion machines, since the fluctuations in Maxwell’s demon itself prevent these machines from operating continuously. To preserve the second law, its formulation must be slightly modified to take fluctuations into account:
‘There can be no automatic device that would produce continuously usable work at the expense of the lowest temperature.’
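A rough energy-scale version of this argument, using Smoluchowski’s trapdoor valve as the standard example (my sketch, not Smoluchowski’s equations): for the door to stay shut against thermal agitation, its restoring energy $\epsilon$ must satisfy $\epsilon \gg k_B T$, but then a single molecule, carrying energy of order $k_B T$, cannot open it. If instead $\epsilon \sim k_B T$, the door itself flaps randomly and lets molecules pass in both directions. In either regime, no systematic sorting occurs.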
Smoluchowski associates the only possibility of creating a perpetual motion machine of the second kind with pure intellect (from the 1914 paper):
‘Therefore a perpetuum mobile is possible in case one considers, in accord with the customary methods of physics, the experimenting person as a kind of ‘Deus ex machina’ that is informed continually and exactly of the momentary state of nature and can set in motion or interrupt microscopic natural processes at any moment without expenditure of work. Thus he would not at all need to possess the capacity of a Maxwell Demon to intercept individual molecules, but he would definitely still be distinct from real living beings in the above points. For, the production of some physical effect through operation of the sensory and also motor nervous system is always associated with an energy cost, independent of the fact that its entire existence is tied with a continuous dissipation of the same.’
In the next paragraph, Smoluchowski leaves room for doubt:
‘Therefore, in view of these circumstances, that real living beings could produce work continuously, or at least in a regular fashion, at the expense of the heat of the lowest temperature, appears truly doubtful, although our knowledge of living processes excludes a definite answer.’
But then the next paragraph begins with the statement: ‘The issues raised at the end go beyond the scope of physics itself.’
Smoluchowski’s solution has proved successful: a perpetual motion machine of the second kind that operates for an extended period has not been created yet. Let me quote from a book with the expressive title ‘Challenges to the Second Law of Thermodynamics’, published in 2005, which examines many proposals for a perpetual motion machine of the second kind:
‘In this volume we will attempt to remain clear on this point; that is, while the second law might be potentially violable, it has not been violated in practice.’
Szilard’s demon
Szilard is known for a thought experiment that played a major role in the attempts of physicists in the second half of the 20th century to relate information and computation to physical processes.
In 1925, Szilard wrote a paper on a phenomenological theory of fluctuations. A description of the development of these ideas by Szilard can be found in the book by Yu. G. Rudoi, ‘Mathematical Structure of Equilibrium Thermodynamics and Statistical Mechanics’ (in Russian, 2013). The devices discussed by Smoluchowski, which fall under Szilard’s fluctuation formalism, can be referred to as mechanical demons. However, they did not involve information in an explicit form, and thus the possibility of correlations between fluctuations in the demon and in the system had not been explored.
This, it seems, was the motivation for the 1929 paper ‘On the Decrease of Entropy in a Thermodynamic System by the Intervention of Intelligent Beings’. Szilard reduced the system in question to a single molecule and introduced a being that, after detecting the molecule’s location in the left or right half of the volume, uses this information to obtain useful work. During one cycle of the controlled subsystem, heat is converted into work, and thus the entropy of the subsystem is reduced. This shows the potential role of information about fluctuations in generating work.
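For reference, the standard quantitative version of the cycle (a textbook reconstruction, not Szilard’s original notation): after the molecule is found in, say, the left half, a partition with a piston is inserted, and the one-molecule gas expands isothermally from $V/2$ to $V$ while in contact with a single heat bath at temperature $T$. The extracted work is

$$W = \int_{V/2}^{V} \frac{k_B T}{V'}\, dV' = k_B T \ln 2,$$

so the entropy of the bath decreases by $k_B \ln 2$ per cycle, paid for by exactly one bit of positional information.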
The mechanical demon is thus transformed into an information demon: measurement, information processing, and action. In my book, the term Szilard’s demon is used to emphasize the difference between this device and Maxwell’s mechanical demons; Szilard’s demon is an information-based version of them. Szilard assumed that the second law of thermodynamics would not be violated in the whole system due to the increase in entropy during the measurement of the molecule’s position. Thus, thermodynamic entropy became associated with the process of obtaining information about a subsystem during measurement.
Brillouin: negentropy principle of information
The first physicist to unify Shannon’s information theory and statistical mechanics was Léon Brillouin (1889-1969). In a series of papers from 1951, he analyzed Maxwell’s information demons, including Szilard’s thought experiment, and in 1956, he published the book ‘Science and Information Theory‘.
Brillouin believed that the degradation of energy is best expressed by the negative of entropy, which he called negentropy. In the book, negentropy is defined formally:
‘Negentropy (N = -S) represents the quality or grade of energy, and must always decrease.’
Next, Brillouin introduces the negentropy principle of information: he equates the change in information with the change in the number of microstates in Boltzmann’s equation. From this he concludes:
‘bound information = decrease in entropy S = increase in negentropy N’
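In symbols, this seems to amount to the following bookkeeping (my reconstruction, using Boltzmann’s relation $S = k \ln W$): if acquiring the information reduces the number of accessible microstates from $W_0$ to $W_1$, then

$$I_b = k \ln \frac{W_0}{W_1}, \qquad S_1 = S_0 - I_b, \qquad N_1 = N_0 + I_b,$$

that is, bound information, entropy decrease, and negentropy increase are the same quantity written in three ways.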
I must admit that I could not follow Brillouin’s treatment, since all the equations formally describe the entropy change, but sometimes the entropy change was called negentropy and sometimes information. I could not understand which name should be used when; for me, all the changes remained entropy changes.
In his examples, Brillouin considered the spontaneous expansion of a gas into a vacuum and the mixing of two gases. Below is a description of the first process without equations:
‘Let us suppose that we have additional information on the state of the gas: for instance, we may happen to know that the gas, at a certain earlier instant of time, occupied a smaller volume V1. This is the case if the gas is in a container V1 and we suddenly open the connection with another volume V2 … The initial entropy S1 is smaller than the entropy S after expansion … [information is measured as the difference in entropies] After we open the volume V2, the gas flows in, density oscillations take place between the two volumes, and the steady state is progressively established with a uniform density throughout the volume V. Increase of entropy and loss of information proceed together. We may say that the gas progressively “forgets” the information.’
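For an ideal gas of $N$ molecules, the entropy change in this free expansion is the standard result (not taken from Brillouin’s text):

$$\Delta S = N k_B \ln \frac{V_1 + V_2}{V_1},$$

and in Brillouin’s reading the same quantity, with the opposite sign, measures the information that the gas ‘forgets’.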
For gas mixing, the conclusion was similar: ‘The increase in entropy corresponds to a loss of information.’ From my perspective, the use of the term ‘information’ in this context is just a metaphor.
In any case, Brillouin used the negentropy principle of information to analyze Maxwell’s information demons. The analysis confirmed the role of measurements: the entropy decrease of the controlled subsystem during the demon’s operation is compensated by an entropy increase in the whole system, including the demon, during the measurement process. Brillouin’s papers and book established a consensus among physicists for the next couple of decades: the acquisition of new information during measurements is associated with a change in thermodynamic entropy, and this was considered the solution that expelled Szilard’s demon.
Landauer and thermodynamics of computation
In parallel, there was a discussion about the minimum costs associated with computation. For certain logical operations it is impossible to recover the initial state from the result, and such operations are referred to as logically irreversible. In 1961, Rolf Landauer connected logical irreversibility with physical irreversibility and analyzed the operation of writing a single bit to memory. In this process there is no verification of the previous memory state, as that would require additional costs; therefore, the operation of writing a bit is irreversible, since it destroys the unknown previous content (memory erasure). As a result, Landauer proposed the principle that such an operation must be accompanied by the release of a minimum amount of heat associated with a change in thermodynamic entropy.
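In its modern form, Landauer’s principle states that erasing one bit reduces the entropy of the memory by $k_B \ln 2$, so at least

$$Q \geq k_B T \ln 2$$

of heat must be dissipated into the environment at temperature $T$ (about $3 \times 10^{-21}$ J at room temperature).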
Later, Charles Bennett and others showed that it is possible to run computations by means of reversible logical operations only, so that memory erasure remains the only logically irreversible operation. Building on this, Bennett proposed a new analysis of Szilard’s demon in the early 1980s, concluding that measurements can be performed without entropy change and associating the necessary increase in entropy with memory erasure instead. According to Bennett, the demon has to write the measurement result to memory, and at the end of the cycle the memory cell has to be reset by erasure. The new interpretation of Szilard’s demon prevailed, although there was additional discussion about the reasons for the mistake made by the previous generation of physicists, led by Brillouin.
In the late 1980s, Wojciech Zurek provided the final touch to this story. Zurek pointed out that in Bennett’s analysis the entropy balance is established only at the end of the cycle, and he suggested extending the analysis to include algorithmic entropy. Zurek proposed that Szilard’s demon uses a specific algorithm based on reversible computation to perform its task; by incorporating algorithmic entropy into the analysis, he showed that the entropy balance holds throughout the whole process.
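Schematically, Zurek’s physical entropy is the sum of two terms (my condensed statement of his proposal):

$$\mathcal{S} = H(\rho \,|\, d) + K(d),$$

where $H$ is the statistical (missing-information) entropy of the system given the demon’s record $d$, and $K(d)$ is the algorithmic complexity of the record itself. On this account, a measurement converts statistical entropy into algorithmic entropy of the memory, so the sum stays constant at every step of the cycle until erasure restores the initial state.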
In the third part of the book, I will return to this issue in Chapter 3.5, ‘Critique of Thermodynamics of Information’. For now, let me just say a few words using the example of a burning candle. According to Brillouin, the burning of a candle involves a loss of information, and one has to understand the meaning of the word ‘information’ in this context. In the thermodynamics of computation, the question is how to determine the presence of computation in a physical process. When a candle is burning, it does not look as if any computation is going on; but that is precisely the question: how can we distinguish between physical processes that involve computation and those that do not?
Next: Chapter 6. Non-equilibrium States in Statistical Mechanics
References
Martin J. Klein, ‘Maxwell, His Demon, and the Second Law of Thermodynamics’, American Scientist 58, no. 1 (1970): 84-97.
Ya. M. Gelfer, History and Methodology of Thermodynamics and Statistical Physics (in Russian), 2nd ed., 1981, Chapter 12, Discovery and Study of Brownian Motion. Further Development of Boltzmann’s Statistical Theory.
John Earman and John D. Norton, ‘EXORCIST XIV: The Wrath of Maxwell’s Demon. Part I. From Maxwell to Szilard’, Studies in History and Philosophy of Modern Physics 29, no. 4 (1998): 435-471.
Vladislav Capek and Daniel P. Sheehan, Challenges to the Second Law of Thermodynamics, 2005.
Léon Brillouin, Science and Information Theory, 1956.
John Earman and John D. Norton, ‘EXORCIST XIV: The Wrath of Maxwell’s Demon. Part II. From Szilard to Landauer and Beyond’, Studies in History and Philosophy of Modern Physics 30, no. 1 (1999): 1-40.