Entropy and Information

In modern scientific culture, the statement that entropy and information are the same thing is ubiquitous. For example, the famous paper by Edwin T. Jaynes [1] has, according to Google Scholar [2], been cited more than 5000 times. The list of books from Prof. Gordon [3] about the deep relationship between entropy and information in biology contains about 30 titles, and I am pretty sure that one can find even more books along this line.

On the other hand, if I consider my personal area of expertise, thermodynamics and experimental thermodynamics, information as such is not there. A simple example: below is the Fe-C phase diagram as computed according to the CALPHAD approach (see for example [4]; the picture used to be on www.calphad.com). The thermodynamic entropy has been employed, the information concept has not.

http://blog.rudnyi.ru/wp-content/uploads/2012/02/FeC.png

Hence we have an interesting situation. Everybody is convinced that entropy is information, but when we look at thermodynamic research, information is not there. How could this happen? As usual, the shoemaker has no shoes?

I have had several discussions about this issue, first on the biotaconv list [5], then on the embryophysics list [6], and finally on the everything list [7, 8, 9 (now deleted)]. Below is a summary.

I will start with a short historical overview in order to show how the intimate relationship between entropy and information developed. Next I will briefly review experimental thermodynamics, as in my view it is important to understand that when we talk about the thermodynamic entropy, we talk about a quantity that has been measured and tabulated in the same way as other thermodynamic values. After that, I will list my examples to clarify the relationship between the thermodynamic entropy and the information entropy, and finally I will present my personal opinion on this issue.

Historical Perspective on Entropy and Information

Thermodynamics and entropy were developed in order to describe heat engines. After that, chemists found many creative uses for thermodynamics and entropy in chemistry. Most often chemists employ thermodynamics in order to compute the equilibrium composition, that is, what should happen in the end when some species are mixed with each other; see for example the phase diagram above.

The development of classical thermodynamics was not an easy birth (see for example Truesdell [10]) and many people find classical thermodynamics difficult to understand. No doubt the Second Law is the reason; people find it highly non-intuitive. Please note that classical thermodynamics is a phenomenological science and does not even require the atomic hypothesis. Hence, when the existence of atoms was proved, statistical thermodynamics based on the atomic theory was developed, and there was hope of finding a good and intuitive way to introduce entropy. Unfortunately, this did not happen.

To explain this, let us consider a simple experiment. We bring a glass of hot water into a room and leave it there. Eventually the temperature of the water will be equal to the ambient temperature. In classical thermodynamics this process is considered irreversible, that is, the Second Law forbids the water in the glass from spontaneously becoming hot again. This is in complete agreement with our experience, so one would expect the same from statistical mechanics. However, there the entropy has a statistical meaning and there is a nonzero chance that the water will become hot again. Moreover, there is a theorem (Poincaré recurrence) stating that if we wait long enough, the water in the glass must become hot again. No doubt the chances are very small and the waiting time is very long, so in practice this is negligible. Some people are happy with such a statistical explanation, some are not.

In any case, statistical mechanics has changed nothing in the practical applications of thermodynamics; rather it helps to derive missing data from atomic properties. We will not find information as such in the classical works of Boltzmann and Gibbs on statistical thermodynamics, which means that statistical thermodynamics existed without information for quite a while.

Shannon introduced the information entropy in his famous paper [11], where he writes:

“The form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics where p_i is the probability of a system being in cell i of its phase space. H is then, for example, the H in Boltzmann's famous H theorem.”

Please note that Shannon has just shown that the equation employed for problems of information transfer is similar to that in statistical thermodynamics. He has not made a statement about the meaning of this similarity, that is, he did not identify his entropy with the thermodynamic entropy. He just used the same term, nothing more. Yet some confusion was already there, since from then on we had similar equations and two similar terms, the thermodynamic entropy and the information entropy. In any case, this state of affairs, with one similar equation describing two different phenomena, again lasted for a while.

Edwin T. Jaynes made the final step in [1]:

p. 622, after eq. (2-3) (this is the Shannon equation): “Since this is just the expression for entropy as found in statistical mechanics, it will be called the entropy of the probability distribution p_i; henceforth we will consider the terms “entropy” and “uncertainty” as synonymous.”

This is exactly the logic that produced the deep relationship between Shannon's information and the thermodynamic entropy: as the two equations are the same, they should describe the same phenomenon, that is, information is the thermodynamic entropy.

Russell Standish [12] has listed other authors who contributed to binding the thermodynamic entropy and information together:

“Because I tend to think of “negentropy”, which is really another term for information, I tend to give priority to Schroedinger who wrote about the topic in the early 40s. But Jaynes was certainly instrumental in establishing the information based foundations to statistical physics, even before information was properly defined (it wasn’t really until the likes of Kolmogorov, Chaitin and Solomonoff in the 60s that information was really understood).

But Landauer in the late 60s was probably the first to make physicists really wake up to the concept of physical information.”

Anyway, in all discussions I have seen a single line of logic: since the equation for the entropy in statistical mechanics is the same as the one for the information entropy in Shannon's paper, the entropy is information. I will give my opinion on this later, in the Conclusion. Right now it is enough to say that at present we have three terms: information in IT, information in physics, and the thermodynamic entropy. Some people consider these three terms synonyms, and some do not.

Experimental Thermodynamics

I have already mentioned that thermodynamics is employed extensively to solve practical problems in engineering and chemistry. Let me quote a couple of paragraphs from the Preface to the JANAF Tables [13] (ca. 230 Mb):

Beginning in the mid-1950s, when elements other than the conventional carbon, hydrogen, oxygen, nitrogen, chlorine, and fluorine came into consideration as rocket propellant ingredients, formidable difficulties were encountered in conducting rigorous theoretical performance calculations for these new propellants. The first major problem was the calculational technique. The second was the lack of accurate thermodynamic data.

By the end of 1959, the calculation technique problem had been substantially resolved by applying the method of minimization of free energy to large, high speed digital computers. At this point the calculations become as accurate as the thermodynamic data upon which they were based. However, serious gaps were present in the available data: For propellant ingredients, only the standard heat of formation is required to conduct a performance calculation. For combustion products, the enthalpy and entropy must be known, as a function of temperature, in addition to the standard heat of formation.

To solve the second problem there was extensive development in experimental thermodynamics, and the results are presented in thermodynamic tables, the most famous being the JANAF Thermochemical Tables (Joint Army-Navy-Air Force Thermochemical Tables) [13] (ca. 230 Mb).

As the name says, the JANAF Tables were originally developed for the military. I guess that the very first edition was classified. Yet there are many peaceful applications as well, and chemists all over the world use these tables nowadays to predict the equilibrium composition of a system in question. Among other properties, the JANAF Tables contain the entropy. I believe that this is a very good starting point for everybody who would like to talk about the entropy: just take the JANAF Tables and see that chemists have successfully measured the entropy for a lot of compounds. As the JANAF Tables are pretty big (almost 2000 pages), a simpler starting point is the CODATA Tables [14].

In essence, the entropy in chemical thermodynamics is a quantitative property that has been measured and tabulated for many substances. You may want to think of it this way: chemists have been using thermodynamics and entropy for a long time to create reliable processes to obtain the substances they need, and they have been successful. I highly recommend that you download the JANAF Tables at this point and browse them. If we talk about the thermodynamic entropy, they are a pretty good starting point.

Practical Examples to Think Over

In my view, the best way to discuss a theory is to try to use it on simple practical examples. To this end, below you will find examples concerning the thermodynamic entropy, the information entropy, and the number of states in physics.

1. The Thermodynamic Entropy: What is information in these examples?

1.1) From the CODATA Tables [14]:

S ° (298.15 K) J K-1 mol-1

Ag  cr  42.55 ± 0.20
Al  cr  28.30 ± 0.10

What do these values tell us about information?

1.2) At constant volume dS = (Cv/T) dT and dU = Cv dT. Is Cv related to information? Is the internal energy related to information?

1.3) In the JANAF Tables there is a column for the entropy as well as for the enthalpy (H = U + pV). The latter can safely be considered as energy. How do people obtain the entropy in the JANAF Tables? The answer is that they measure the heat capacity and then take the integral at a constant pressure of one atmosphere

S_T = Integral_from_0_to_T Cp/T dT

If there are phase transitions, then it is necessary to add Del H_ph_tr/T_ph_tr. At the same time, the change in the enthalpy is

H_T – H_0 = Integral_from_0_to_T Cp dT

Here is a question to think over. What is the difference between Integral Cp/T dT and Integral Cp dT? Should both have something to do with information, or only the first one?
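
To see concretely how these two integrals are evaluated, here is a minimal Python sketch. It is a numerical illustration only: the heat capacity values below are invented, not measured data, and the small contribution below the first temperature point (normally estimated with a Debye extrapolation) as well as phase transitions are ignored.

import numpy as np

# Hypothetical smoothed heat capacity data, J K-1 mol-1 (illustration only, not real measurements)
T  = np.array([10.0, 50.0, 100.0, 150.0, 200.0, 250.0, 298.15])
Cp = np.array([0.5,   8.0,  17.0,  22.0,  25.0,  27.0,  28.3])

# S_T = Integral_0^T (Cp/T) dT  -- third-law entropy, S(0) = 0 assumed
S_T = np.trapz(Cp / T, T)

# H_T - H_0 = Integral_0^T Cp dT  -- enthalpy increment
dH = np.trapz(Cp, T)

print("S(298.15 K)   ~", round(S_T, 2), "J K-1 mol-1")
print("H(298) - H(0) ~", round(dH / 1000.0, 2), "kJ mol-1")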

1.4) A problem. Given the temperature, the pressure, and the initial number of moles of NH3, N2 and H2, compute the equilibrium composition. The thermodynamic entropy is present in this example. What is the meaning of information here?

To solve the problem, one should find the thermodynamic properties of NH3, N2 and H2, for example in the JANAF Tables, and then compute the equilibrium constant.

From the thermodynamic tables (all values are molar values at the standard pressure of 1 bar; I have omitted the standard-state symbol ° for simplicity, but it is very important not to forget it):

Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)

2NH3 = N2 + 3H2

Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) – 2 Del_f_H_298(NH3)

Del_S_r_298 = S_298(N2) + 3 S_298(H2) – 2 S_298(NH3)

Del_Cp_r = Cp(N2) + 3 Cp(H2) – 2 Cp(NH3)

To keep things simple, I will assume below that Del_Cp_r = 0, but it is not a big deal to extend the equations to include the heat capacities as well.

Del_G_r_T = Del_H_r_298 – T Del_S_r_298

Del_G_r_T = – R T ln Kp

When Kp, the total pressure and the initial number of moles are given, it is rather straightforward to compute the equilibrium composition. If you need help, please just let me know.
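
For concreteness, here is a minimal Python sketch of this calculation under the simplifications above (Del_Cp_r = 0, ideal gas, standard pressure 1 bar). The numerical values of Del_H_r_298 and Del_S_r_298 are approximate placeholders; for real work they should be taken from the JANAF Tables.

import numpy as np
from scipy.optimize import brentq

R = 8.314            # J K-1 mol-1
T = 298.15           # K
P = 1.0              # total pressure in bar (standard pressure 1 bar)

# Approximate values for 2 NH3 = N2 + 3 H2 (placeholders; check against JANAF/CODATA)
dH_r = 91800.0       # Del_H_r_298, J mol-1
dS_r = 198.0         # Del_S_r_298, J K-1 mol-1

dG_r = dH_r - T * dS_r
Kp = np.exp(-dG_r / (R * T))

# Initial amounts, mol
n_NH3_0, n_N2_0, n_H2_0 = 1.0, 0.0, 0.0

def residual(xi):
    # extent of reaction xi for 2 NH3 -> N2 + 3 H2
    n_NH3 = n_NH3_0 - 2 * xi
    n_N2  = n_N2_0 + xi
    n_H2  = n_H2_0 + 3 * xi
    n_tot = n_NH3 + n_N2 + n_H2
    Q = (n_N2 / n_tot) * (n_H2 / n_tot) ** 3 / (n_NH3 / n_tot) ** 2 * P ** 2
    return np.log(Q) - np.log(Kp)

xi = brentq(residual, 1e-12, n_NH3_0 / 2 - 1e-12)
n = np.array([n_NH3_0 - 2 * xi, n_N2_0 + xi, n_H2_0 + 3 * xi])
print("Kp =", Kp)
print("equilibrium mole fractions (NH3, N2, H2):", n / n.sum())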

1.5) The Fe-C phase diagram shown in the introduction. It has been computed from a Thermo-Calc database [4] that contains entropies obtained by experimental thermodynamics. What is the meaning of information here?

2. The Information Entropy: What is the thermodynamic system in these examples?

2.1) An example of the entropy used by engineers in information technology has been given by Jason [15], and I quote him below. Could you please tell me: the thermodynamic entropy of what, exactly, is discussed in his example?

On 03.02.2012 00:14 Jason Resch said the following:


> Sure, I could give a few examples as this somewhat intersects with my
> line of work.
>
> The NIST 800-90 recommendation
> for random number generators is a document for engineers implementing
> secure pseudo-random number generators.  An example of where it is
> important is when considering entropy sources for seeding a random
> number generator.  If you use something completely random, like a
> fair coin toss, each toss provides 1 bit of entropy.  The formula is
> -log2(predictability).  With a coin flip, you have at best a .5
> chance of correctly guessing it, and -log2(.5) = 1.  If you used a
> die roll, then each die roll would provide -log2(1/6) = 2.58 bits of
> entropy.  The ability to measure unpredictability is necessary to
> ensure, for example, that a cryptographic key is at least as
> difficult to predict the random inputs that went into generating it
> as it would be to brute force the key.
>
> In addition to security, entropy is also an important concept in the
> field of data compression.  The amount of entropy in a given bit
> string represents the theoretical minimum number of bits it takes to
> represent the information.  If 100 bits contain 100 bits of entropy,
> then there is no compression algorithm that can represent those 100
> bits with fewer than 100 bits.  However, if a 100 bit string contains
> only 50 bits of entropy, you could compress it to 50 bits.  For
> example, let’s say you had 100 coin flips from an unfair coin.  This
> unfair coin comes up heads 90% of the time.  Each flip represents
> -log2(.9) = 0.152 bits of entropy.  Thus, a sequence of 100 coin
> flips with this biased coin could be represent with 16 bits.  There
> is only 15.2 bits of information / entropy contained in that 100 bit
> long sequence.
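
For what it is worth, the arithmetic in the quote is easy to reproduce; the sketch below simply evaluates -log2(predictability) for the three cases mentioned. (Note that the Shannon entropy of the biased coin averaged over both outcomes, -0.9 log2 0.9 - 0.1 log2 0.1, is about 0.47 bits per flip rather than 0.152; the quote uses the surprisal of the more likely outcome.)

from math import log2

def bits(predictability):
    # entropy per event in the sense used in the quote: -log2(predictability)
    return -log2(predictability)

print(bits(0.5))      # fair coin toss: 1.0 bit
print(bits(1 / 6))    # die roll: about 2.58 bits
print(bits(0.9))      # coin that shows heads 90% of the time: about 0.152 bits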

2.2) This is a paper from control theory:

J. C. Willems and H. L. Trentelman, H_inf control in a behavioral context: The full information case, IEEE Transactions on Automatic Control, Volume 44, pages 521-536, 1999.

The term information is there. What is the related thermodynamic entropy?

2.3) In my understanding, an algorithm is a pure IT construct that does not depend on whether I implement it with an abacus or some Turing machine, with an Intel or a PowerPC processor. From this it follows that the algorithm, and hence its information entropy, does not depend on the temperature or pressure of the physical system that performs the computation. In my view this makes sense.

Now let us consider consciousness. Our brain produces it, and our brain has some thermodynamic entropy. If we assume that the same effect could be achieved with some robot, does it mean that the thermodynamic entropy of the robot must be the same as that of the brain?

2.4) Information as a representation on a physical object.

Let us consider the string "10" for simplicity, and let us consider the following cases. First I will cite the thermodynamic properties of Ag and Al from the CODATA Tables (we will need them):

S ° (298.15 K) J K-1 mol-1

Ag  cr  42.55 ± 0.20
Al  cr  28.30 ± 0.10

In J K-1 cm-3 it will be

Ag  cr  42.55/107.87*10.49 = 4.14
Al  cr  28.30/26.98*2.7 = 2.83

A) An abstract string "10".

B) Let us now make an aluminum plate (a page) with "10" hammered on it (as on a coin), of total volume 10 cm^3. The thermodynamic entropy is then 28.3 J/K.

C) Let us now make a silver plate (a page) with "10" hammered on it (as on a coin), of total volume 10 cm^3. The thermodynamic entropy is then 41.4 J/K.

D) We can easily make another aluminum plate by scaling up all dimensions of the plate from B) to a total volume of 100 cm^3. The thermodynamic entropy is then 283 J/K.

Now we have four different ways to represent the string "10", and the thermodynamic entropy is different in each case. Any comments?
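
The arithmetic behind cases B)-D) in a small Python sketch, just to stress that only the molar entropy, the molar mass, the density and the volume enter; the hammered string plays no role:

# Thermodynamic entropy of a plate: S = (S_molar / M) * rho * V
# S_molar from CODATA in J K-1 mol-1, M in g mol-1, rho in g cm-3
materials = {
    "Ag": (42.55, 107.87, 10.49),
    "Al": (28.30,  26.98,  2.70),
}

for name, (S_molar, M, rho) in materials.items():
    for V in (10.0, 100.0):                      # plate volume in cm^3
        S_plate = S_molar / M * rho * V          # J K-1
        print(f"{name}, {V:5.0f} cm^3: S = {S_plate:6.1f} J/K")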

3. Information in Physics: the number of states

The thermodynamic entropy can be considered as a measure of the number of states, and one could say that this is the information in physics. I will quote Brent's comment on problem 2.4 [16]:

“The thermodynamic entropy is a measure of the information required to locate the possible states of the plates in the phase space of atomic configurations constituting them.”

This is formally correct, but then the question is what the relationship of this number of states is to information in IT.

3.1) It would certainly be interesting to consider what happens when we increase or decrease the temperature (in the limit of zero kelvin, according to the Third Law, the entropy will be zero). What do you think: can we save less information on a copper plate at low temperatures compared with higher temperatures? Or more?

If engineers took the statement "the maximum possible value for information increases with temperature" literally, they should operate a hard disk at higher temperatures (the higher the better, according to such a statement). Yet this does not happen. Do you know why?

If I operate my memory stick in some reasonable range of temperatures, the information it contains does not change. Yet the entropy, in my view, changes. Why is it this way?
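
To put a rough number on this claim, here is a sketch of the entropy change of, say, 10 g of copper between 280 K and 320 K, assuming a constant heat capacity over this narrow range (about 24.4 J K-1 mol-1 for copper near room temperature):

from math import log

m, M = 10.0, 63.55          # mass in g, molar mass of copper in g mol-1
Cp = 24.4                   # J K-1 mol-1, taken as constant over the range
T1, T2 = 280.0, 320.0       # K

dS = (m / M) * Cp * log(T2 / T1)   # Delta S = n Cp ln(T2/T1) at constant pressure
print(round(dS, 2), "J/K")         # about 0.5 J/K, yet the stored bits are unchanged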

3.2) My example would be IBM's Millipede.

I am pretty sure that when the IBM engineers developed it, they did not employ the thermodynamic entropy to estimate its information capacity. Also, an increase in temperature would destroy the information saved there.

3.3) In general we are surrounded by devices that store information (hard disks, memory sticks, DVDs, etc.). The amount of information these devices can store is, I believe, known to an accuracy of one bit. Can you suggest a thermodynamic state whose entropy gives us exactly that amount of information?

3.4) Let us consider a coin and imagine that the temperature goes to zero kelvin. What happens to the text imprinted on the coin in this case?

Conclusion

The examples presented above betray my personal opinion. Yes, I believe that the thermodynamic entropy and the information entropy are not related to each other. Personally I find the reasoning that identifies them clumsy. In my view, the same mathematical structure of the equations does not imply that the phenomena are related. For example, the Poisson equation of electrostatics is mathematically equivalent to the stationary heat conduction equation. What does this mean? Well, it allows a creative way to solve an electrostatic problem for people who have a thermal FEM solver but no electrostatic solver: by means of the mathematical analogy, they can solve their electrostatic problem with the thermal solver. This does happen, but I doubt that we could state that stationary heat conduction is equivalent to electrostatics.
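
To illustrate how purely mathematical such an analogy is, the sketch below solves the same one-dimensional model equation -u'' = f with one and the same solver; u can be read as a temperature and f as a heat source (stationary heat conduction), or u as an electrostatic potential and f as a scaled charge density (electrostatics). The numbers are arbitrary.

import numpy as np

# One finite-difference solver for -u'' = f on (0, 1) with u(0) = u(1) = 0
def solve_poisson_1d(f, n=101):
    x = np.linspace(0.0, 1.0, n)
    h = x[1] - x[0]
    A = (np.diag(2.0 * np.ones(n - 2))
         - np.diag(np.ones(n - 3), 1)
         - np.diag(np.ones(n - 3), -1)) / h**2
    u = np.zeros(n)
    u[1:-1] = np.linalg.solve(A, f(x[1:-1]))
    return x, u

# The same code, two physical readings:
x, temperature = solve_poisson_1d(lambda x: 100.0 * np.ones_like(x))  # uniform heat source
x, potential   = solve_poisson_1d(lambda x: np.sin(np.pi * x))        # some charge distribution
print(temperature.max(), potential.max())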

On the everything list, Brent recommended the paper [17] to me. In the paper, the information entropy and the thermodynamic entropy are considered, and the conclusion is as follows:

p. 28(142) “First, all notions of entropy discussed in this essay, except the thermodynamic and the topological entropy, can be understood as variants of some information-theoretic notion of entropy.”

I understand it this way. When I work with a gas, a liquid or a solid at the level of experimental thermodynamics, the information, according to the authors, is not there (at this point I agree with them). Yet, as soon as theoretical physicists start thinking about these objects, they turn out to be filled to the brim with information.

Alternatively, one could say that the viewpoint "information and the entropy are the same" brings us useful conjectures. However, no one has told me exactly what useful conjectures follow. In the discussion on biotaconv [5], I suggested considering two statements:

  • The thermodynamic and information entropies are equivalent.
  • The thermodynamic and information entropies are completely different.

and asked what difference it makes in artificial life research. I still do not know the answer.

I would like to conclude with a quote from Arnheim [18]. In my view, it nicely characterizes the current status of the relationship between entropy and information.

“The absurd consequences of neglecting structure but using the concept of order just the same are evident if one examines the present terminology of information theory. Here order is described as the carrier of information, because information is defined as the opposite of entropy, and entropy is a measure of disorder. To transmit information means to induce order. This sounds reasonable enough. Next, since entropy grows with the probability of a state of affairs, information does the opposite: it increases with its improbability. The less likely an event is to happen, the more information does its occurrence represent. This again seems reasonable. Now what sort of sequence of events will be least predictable and therefore carry a maximum of information? Obviously a totally disordered one, since when we are confronted with chaos we can never predict what will happen next. The conclusion is that total disorder provides a maximum of information; and since information is measured by order, a maximum of order is conveyed by a maximum of disorder. Obviously, this is a Babylonian muddle. Somebody or something has confounded our language.”

References

  1. E. T. Jaynes, Information theory and statistical mechanics, Phys. Rev. Part I: 106, 620–630 (1957), Part II: 108, 171–190 (1957)
  2. Google Scholar, http://scholar.google.com/scholar?q=E.+T.+Jaynes
  3. Books on entropy and information, http://blog.rudnyi.ru/2012/02/books-on-entropy-and-information.html
  4. J.-O. Andersson, Thomas Helander, Lars Höglund, Pingfang Shi, Bo Sundman, Thermo-Calc & DICTRA, Computational Tools for Materials Science, Calphad, Vol. 26, No. 2, pp. 273-312, 2002.
  5. Entropy and Artificial Life, see the section below.
  6. Entropy and Information, http://groups.google.com/group/embryophysics/t/a14b0a6b9294cf3
  7. deleted.
  8. deleted.
  9. deleted.
  10. C. Truesdell, The Tragicomical History of Thermodynamics, 1822-1854 (Studies in the History of Mathematics and Physical Sciences), 1980.
  11. C. E. Shannon, A mathematical theory of communication [corrected version], Bell System Technical Journal 27, 379–423, 623–656 (1948).
  12. Russell Standish, http://groups.google.com/group/everything-list/msg/fc0272531101a9e6
  13. NIST-JANAF Thermochemical Tables, Fourth Edition, J. Phys. Chem. Ref. Data, Monograph No. 9, 1998.
  14. J. D. Cox,  D. D. Wagman,  V. A. Medvedev, CODATA Key Values for Thermodynamics, Hemisphere Publishing Corp., New York, 1989.
  15. Jason Resch, http://groups.google.com/group/everything-list/msg/c2da04dbd4ce2f8d
  16. Brent, http://groups.google.com/group/everything-list/msg/bd727f700d0d58c0
  17. Roman Frigg, Charlotte Werndl, Entropy: A Guide for the Perplexed, in Claus Beisbart and Stephan Hartmann (eds.): Probability in Physics, Oxford University Press, 2011, 115-142.
  18. Rudolf Arnheim, Entropy and Art: An Essay on Disorder and Order, 1971

Discussion:

http://groups.google.com/group/everything-list/t/240cc3ac8f614f6d

http://groups.google.com/group/embryophysics/t/419d3c1fec30e3b5

25.12.2010 Entropy and Artificial Life

Introduction

I have worked in chemical thermodynamics for quite a while. No doubt I had heard of the information entropy, but I had always thought that it has nothing to do with the entropy in chemical thermodynamics. Recently I started reading papers on artificial life, and it came as a complete surprise to me that so many people there consider the thermodynamic entropy and the information entropy to be the same. Let me cite, for example, a few sentences from Christoph Adami's "Introduction to Artificial Life":

p. 94 “Entropy is a measure of the disorder present in a system, or alternatively, a measure of our lack of knowledge about this system.”

p. 96 “If an observer gains knowledge about the system and thus determines that a number of states that were previously deemed probable are in fact unlikely, the entropy of the system (which now has turned into a conditional entropy), is lowered, simply because the number of different possible states is now lower. (Note that such a change in uncertainty is usually due to a measurement).”

p. 97 “Clearly, the entropy can also depend on what we consider “different”. For example, one may count states as different that differ by, at most, del_x in some observable x (for example, the color of a ball drawn from an ensemble of differently shaded balls in an urn). Such entropies are then called fine-grained (if del_x is small), or coarse-grained (if del_x is large) entropies.”

This entropy is completely different from the one used in chemical thermodynamics. The goal of this document is to demonstrate this fact.

I have had a discussion about these matters on biotaconv that helped me to better understand the point of view accepted in the artificial life community. I am thankful to the members of the list for a nice discussion.

I will start with a short description of how the entropy is employed in chemical thermodynamics, then I will briefly review the reasons for the opinion that the thermodynamic and information entropies are the same, and finally I will try to show that information and subjectivity have nothing to do with the entropy in chemical thermodynamics.

Entropy in Chemical Thermodynamics

Thermodynamics and entropy were developed in order to describe heat engines. After that, chemists found many creative uses for thermodynamics and entropy in chemistry. Most often chemists employ thermodynamics in order to compute the equilibrium composition, that is, what should happen in the end when some species are mixed with each other. To this end there are thermodynamic tables, the most famous being the JANAF Thermochemical Tables (Joint Army-Navy-Air Force Thermochemical Tables).

As the name says, the JANAF Tables were originally developed for the military. I guess that the very first edition was classified. Yet there are many peaceful applications as well, and chemists all over the world use these tables nowadays to predict the equilibrium composition of a system in question. Among other properties, the JANAF Tables contain the entropy. I believe that this is a very good starting point for everybody who would like to talk about the entropy: just take the JANAF Tables and see that chemists have successfully measured the entropy for a lot of compounds. As the JANAF Tables are pretty big (almost 2000 pages), a simpler starting point is the CODATA Tables.

In essence, the entropy in chemical thermodynamics is a quantitative property that does not depend on an observer. One can close one's eyes or open them, one can know about the JANAF Tables or not: this does not influence the entropy of substances at all. You may want to think of it this way: chemists have been using thermodynamics and entropy for a long time to create reliable processes to obtain the substances they need, and they have been successful.

Entropy and Information

The members of biotaconv brought to my attention the works of Edwin T. Jaynes, who was presumably the first to show the equivalence between the thermodynamic and information entropies. His papers Information Theory and Statistical Mechanics (Parts I and II) are available on the Internet (the links are in Wikipedia).

http://upload.wikimedia.org/wikipedia/commons/thumb/b/b0/ETJaynes1.jpg/200px-ETJaynes1.jpg

http://en.wikipedia.org/wiki/Edwin_Thompson_Jaynes

In the papers the author considers statistical mechanics only, so let me first describe the relationship between classical and statistical thermodynamics. As the name says, classical thermodynamics was created first. Its development was not an easy birth (see, for example, Truesdell's The Tragicomical History of Thermodynamics), and many people find classical thermodynamics difficult to understand. No doubt the Second Law is the reason; people find it highly non-intuitive. On the other hand, statistical thermodynamics was based on the atomic theory, and there was hope of finding a good and intuitive way to introduce entropy. Well, it actually did not happen.

To explain this, let us consider a simple experiment. We bring a glass of hot water into a room and leave it there. Eventually the temperature of the water will be equal to the ambient temperature. In classical thermodynamics this process is considered irreversible, that is, the Second Law forbids the water in the glass from spontaneously becoming hot again. This is in complete agreement with our experience, so one would expect the same from statistical mechanics. However, there the entropy has a statistical meaning and there is a nonzero chance that the water will become hot again. Moreover, there is a theorem (Poincaré recurrence) stating that if we wait long enough, the water in the glass must become hot again. No doubt the chances are very small and the waiting time is very long, so in practice this is negligible. Some people are happy with such a statistical explanation, some are not.

The goal of Edwin T. Jaynes was therefore to bring a new explanation of the above. He uses the formal equivalence between the Shannon and thermodynamic entropies and, based on this, suggests the entropy inference, or subjective statistical thermodynamics. I should say that I enjoyed Part I. Here the assumption is that, as we do not have complete information about the system, we use what is available and then just maximize the entropy to get the most plausible description. In a way it is a data-fitting problem, and it seems similar to maximum likelihood in statistics. Yet the term information here means something like the available experimental data about the system, and the entropy is not the same as in classical thermodynamics.
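
A minimal sketch of the entropy-maximization step described in Part I, using the standard textbook illustration (not anything from chemical thermodynamics): suppose the only available "information" about a die is that its long-run average is 4.5; maximizing the Shannon entropy under this constraint gives an exponential, Boltzmann-like distribution over the faces.

import numpy as np
from scipy.optimize import brentq

x = np.arange(1, 7)        # faces of a die
mean_obs = 4.5             # the only available "information": the observed average

# The maximum-entropy distribution has the form p_i ~ exp(lam * x_i);
# choose lam so that the constraint sum(p_i * x_i) = mean_obs holds.
def mean_for(lam):
    w = np.exp(lam * x)
    p = w / w.sum()
    return (p * x).sum()

lam = brentq(lambda l: mean_for(l) - mean_obs, -10.0, 10.0)
p = np.exp(lam * x)
p /= p.sum()

print("p =", np.round(p, 4))
print("entropy =", -(p * np.log(p)).sum(), "nats")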

Unfortunately, I was not able to follow the logic of the second paper. I will just make a couple of citations.

“With such an interpretation the expression “irreversible process” represents a semantic confusion; it is not the physical process that is irreversible, but rather our ability to follow it. The second law of thermodynamics then becomes merely the statement that although our information as to the state of a system may be lost in a variety of ways, the only way in which it can be gained is by carrying out further measurements.”

“It is important to realize that the tendency of entropy to increase is not a consequence of the laws of physics as such, … . An entropy increase may occur unavoidably, due to our incomplete knowledge of the forces acting on a system, or it may be entirely voluntary act on our part.”

This is somewhat similar to what Christoph Adami says (see the Introduction). What I am going to do next is go back to chemical thermodynamics.

Information and Chemical Thermodynamics

Let us assume that the entropy is subjective and that it describes information. If this is true, then this must also apply in chemical thermodynamics. At this point I will cite a couple of paragraphs from the Preface to the JANAF Tables:

Beginning in the mid-1950s, when elements other than the conventional carbon, hydrogen, oxygen, nitrogen, chlorine, and fluorine came into consideration as rocket propellant ingredients, formidable difficulties were encountered in conducting rigorous theoretical performance calculations for these new propellants. The first major problem was the calculational technique. The second was the lack of accurate thermodynamic data.

By the end of 1959, the calculation technique problem had been substantially resolved by applying the method of minimization of free energy to large, high speed digital computers. At this point the calculations become as accurate as the thermodynamic data upon which they were based. However, serious gaps were present in the available data: For propellant ingredients, only the standard heat of formation is required to conduct a performance calculation. For combustion products, the enthalpy and entropy must be known, as a function of temperature, in addition to the standard heat of formation.

One could imagine that Edwin T. Jaynes knew nothing about this, as in his time it could even have been classified. Still, chemical thermodynamics was already a well-developed science in the 1950s, and the paragraphs above concern just one particular application of it. In any case, it is hard to understand why Christoph Adami does not know how chemists employ entropy in their work.

Thus, what is subjective in the JANAF Tables? What does the entropy in the JANAF Tables have to do with information? I have found no answers to these questions so far.

Another point. People attribute information and subjectivity to the entropy. At the same time they do not see any problem with the energy. In the JANAF Tables there is a column for the entropy as well as for the enthalpy (H = U + pV). The latter can safely be considered as energy. How do people obtain the entropy in the JANAF Tables? The answer is that they measure the heat capacity and then take the integral at a constant pressure of one atmosphere

S_T = Integral_from_0_to_T Cp/T dT

If there are phase transitions, then it is necessary to add Del H_ph_tr/T_ph_tr. At the same time, the change in the enthalpy is

H_T - H_0 = Integral_from_0_to_T Cp dT

Here is another question. What is the difference between Integral Cp/T dT and Integral Cp dT? Why does the first integral have something to do with information while the second does not? Why does the first integral have something to do with subjectivity while the second does not?

Conclusion

I have tried to show that subjectivity and information do not belong in chemical thermodynamics. In my view, this means that the thermodynamic entropy has nothing to do with the information entropy. Let me repeat my logic once more.

Chemical thermodynamics makes extensive use of the entropy. The entropy is tabulated in thermodynamic tables and then, among other thermodynamic properties, is used to compute the equilibrium composition or a complete phase diagram (see for example CALPHAD). I would expect people who state that the information entropy is the same as the thermodynamic entropy to show how such a statement works in chemical thermodynamics. Personally I see no way to apply the information entropy and subjectivity in chemical thermodynamics. To this end, there is a good video from MIT, Teaching the Second Law, where a panel of scientists discusses how the Second Law should be taught. The video shows that the entropy is a difficult concept indeed: the scientists do not agree with each other on how to teach the Second Law. Yet the concept of the entropy as information is not there at all.

I will conclude with yet another question. Let us consider two statements:

  • The thermodynamic and information entropies are equivalent.
  • The thermodynamic and information entropies are completely different.

What difference does it make in artificial life research? So far I have not seen a good answer to why it is so important in artificial life to state that the information entropy has something to do with the entropy in chemical thermodynamics.

http://groups.google.com/group/embryophysics/t/a14b0a6b9294cf3


Posted

in

,

by

Tags: