In modern scientific culture, the statement that entropy and information are the same thing is ubiquitous. For example, the famous paper of Edwin T. Jaynes has, according to Google Scholar, been cited more than 5000 times. The list of books from Prof. Gordon about the deep relationship between entropy and information in biology contains about 30 titles, and I am pretty sure that one can find even more books along this line.
On the other hand, if I consider my personal area of expertise, thermodynamics and experimental thermodynamics, information as such is not there. A simple example: below is the Fe-C phase diagram as computed according to the CALPHAD approach (see for example ). The thermodynamic entropy has been employed; the information concept has not.
Hence we have an interesting situation. Everybody is convinced that entropy is information, but when we look at thermodynamic research, information is not there. How could this happen? As usual, the shoemaker has no shoes?
I will start with a short historical overview in order to show how the intimate relationship between entropy and information has developed. Next I will briefly review experimental thermodynamics, as in my view it is important to understand that when we talk about the thermodynamic entropy, it has been measured and tabulated the same way as other thermodynamic values. After that, I will list my examples to clarify the relationship between the thermodynamic entropy and the information entropy, and finally I will present my personal opinion on this issue.
Historical Perspective on Entropy and Information
Thermodynamics and entropy were developed in order to describe heat engines. After that, chemists found many creative uses of thermodynamics and entropy in chemistry. Most often chemists employ thermodynamics in order to compute the equilibrium composition, that is, what should happen in the end when some species are mixed with each other; see for example the figure with the phase diagram above.
The development of classical thermodynamics was not an easy birth (see for example Truesdell ), and many people find classical thermodynamics difficult to understand. No doubt the Second Law is the reason here; people find it highly non-intuitive. Please note that classical thermodynamics is a phenomenological science and does not even require the atomic hypothesis. Hence, when the existence of atoms was proved, statistical thermodynamics based on the atomic theory was developed, and there was hope to find a good and intuitive way to introduce entropy. Unfortunately, this did not happen.
In order to explain this, let us consider a simple experiment. We bring a glass of hot water into a room and leave it there. Eventually the temperature of the water will be equal to the ambient temperature. In classical thermodynamics this process is considered irreversible, that is, the Second Law forbids the water in the glass from becoming hot again spontaneously. This is in complete agreement with our experience, so one would expect the same from statistical mechanics. However, there the entropy has a statistical meaning, and there is a nonzero chance that the water will become hot again. Moreover, there is a theorem (the Poincaré recurrence theorem) stating that if we wait long enough, the water in the glass must become hot again. No doubt the chances are very small and the time to wait is very long; in a way, this is negligible. Some people are happy with such a statistical explanation, some are not.
In any case, statistical mechanics has changed nothing in the practical applications of thermodynamics; rather, it helps to derive missing data from atomic properties. We will not find information as such in the classical works of Boltzmann and Gibbs on statistical thermodynamics, which means that statistical thermodynamics existed without information for quite a while.
Shannon introduced the information entropy in his famous paper , where he writes:
“The form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics where p_i is the probability of a system being in cell i of its phase space. H is then, for example, the H in Boltzmann's famous H theorem.”
Please note that Shannon merely showed that the equation employed for problems of information transfer is similar to that in statistical thermodynamics. He made no statement about the meaning of such a similarity; that is, he did not identify his entropy with the thermodynamic entropy. He just used the same term, nothing more. Yet some confusion was already there, as since then we have had similar equations and two similar terms: the thermodynamic entropy and the information entropy. In any case, such a state, with a similar equation describing two different phenomena, again lasted for a while.
Edwin T. Jaynes made the final step in his paper on information theory and statistical mechanics. On p. 622, after eq. (2-3) (this is the Shannon equation), he writes:

“Since this is just the expression for entropy as found in statistical mechanics, it will be called the entropy of the probability distribution p_i; henceforth we will consider the terms “entropy” and “uncertainty” as synonymous.”
This is exactly the logic that established the deep relationship between Shannon's information and the thermodynamic entropy: as the two equations are the same, they should describe the same phenomenon, that is, information is the thermodynamic entropy.
Russell Standish  has listed other authors who contributed to binding the thermodynamic entropy and information:
“Because I tend to think of “negentropy”, which is really another term for information, I tend to give priority to Schroedinger who wrote about the topic in the early 40s. But Jaynes was certainly instrumental in establishing the information based foundations to statistical physics, even before information was properly defined (it wasn’t really until the likes of Kolmogorov, Chaitin and Solomonoff in the 60s that information was really understood).
But Landauer in the late 60s was probably the first to make physicists really wake up to the concept of physical information.”
Anyway, in all discussions I have seen a single line of logic: if the equation for the entropy in statistical mechanics is the same as that for the information entropy in Shannon's paper, then entropy is information. I will give my opinion on this later, in the discussion. Right now it is enough to say that at present we have three terms: information in IT, information in physics, and the thermodynamic entropy. Some people consider these three terms synonyms, and some do not.
I have already mentioned that thermodynamics is employed extensively to solve practical problems in engineering and chemistry. Let me quote a couple of paragraphs from the Preface to the JANAF Tables :
“Beginning in the mid-1950s, when elements other than the conventional carbon, hydrogen, oxygen, nitrogen, chlorine, and fluorine came into consideration as rocket propellant ingredients, formidable difficulties were encountered in conducting rigorous theoretical performance calculations for these new propellants. The first major problem was the calculational technique. The second was the lack of accurate thermodynamic data.”
“By the end of 1959, the calculation technique problem had been substantially resolved by applying the method of minimization of free energy to large, high speed digital computers. At this point the calculations become as accurate as the thermodynamic data upon which they were based. However, serious gaps were present in the available data: For propellant ingredients, only the standard heat of formation is required to conduct a performance calculation. For combustion products, the enthalpy and entropy must be known, as a function of temperature, in addition to the standard heat of formation.”
In order to solve the second problem there was extensive development in experimental thermodynamics, and the results are presented in thermodynamic tables, the most famous being the JANAF Thermochemical Tables (Joint Army-Navy-Air Force Thermochemical Tables)  (ca. 230 Mb).
As the name says, the JANAF Tables were originally developed for the military. I guess that the very first edition was classified. Yet there are many peaceful applications as well, and chemists all over the world use these tables nowadays to predict the equilibrium composition of a system in question. Among other properties, the JANAF Tables contain the entropy. I believe this is a very good starting point for everybody who would like to talk about entropy: just take the JANAF Tables and see that chemists have successfully measured the entropy for a lot of compounds. As the JANAF Tables are pretty big (almost 2000 pages), a simpler starting point is the CODATA Tables .
In essence, the entropy in chemical thermodynamics is a quantitative property that has been measured and tabulated for many substances. You may want to think of it this way: chemists have been using thermodynamics and entropy for a long time to create reliable processes to obtain the substances they need, and they have been successful. I highly recommend that at this point you download the JANAF Tables and browse them. If we talk about the thermodynamic entropy, this is a pretty good starting point.
Practical Examples to Think Over
In my view, the best way to discuss a theory is to try to use it on simple practical examples. To this end, below you will find examples for the thermodynamic entropy, the information entropy, and the number of states in physics.
1. The Thermodynamic Entropy: What is information in these examples?
1.1) From CODATA  tables
S ° (298.15 K) J K-1 mol-1
Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10
What do these values tell us about information?
1.2) At constant volume, dS = (Cv/T) dT and dU = Cv dT. Is Cv related to information? Is the internal energy related to information?
1.3) In the JANAF Tables there is a column for the entropy as well as for the enthalpy (H = U + pV). The latter can safely be considered as energy. How do people obtain the entropy in the JANAF Tables? The answer is that they measure the heat capacity and then take the integral at the constant pressure of one atmosphere
S_T = ∫_0^T (Cp/T) dT

If there are phase transitions, it is necessary to add Del H_ph_tr/T_ph_tr. At the same time, the change in the enthalpy is

H_T – H_0 = ∫_0^T Cp dT
Here is a question to think over. What is the difference between ∫ Cp/T dT and ∫ Cp dT? Should both have something to do with information, or only the first one?
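The two integrals are easy to evaluate numerically. Below is a minimal sketch with a toy heat capacity (a Debye-like T^3 law that saturates at the Dulong-Petit value at 300 K); the numbers are made up and do not correspond to any real substance, the point being only that the entropy and the enthalpy change come from the same calorimetric data and differ by the 1/T weight.

```python
# Toy model: S_T = Int_0^T Cp/T dT versus H_T - H_0 = Int_0^T Cp dT.
R = 8.314  # gas constant, J/(K mol)

def cp(T):
    """Toy heat capacity: Debye-like T^3 law below 300 K, Dulong-Petit above."""
    return 3 * R * (T / 300.0) ** 3 if T < 300.0 else 3 * R

def integrate(f, a, b, n=100000):
    """Simple trapezoidal rule."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        s += f(a + i * h)
    return s * h

T = 298.15
S = integrate(lambda t: cp(t) / t, 1e-6, T)   # entropy, J/(K mol)
dH = integrate(cp, 0.0, T)                    # enthalpy change, J/mol
print(S, dH)
```

Note that thanks to the T^3 law the integrand Cp/T vanishes at zero kelvin, so the entropy integral converges, in line with the Third Law.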
1.4) Problem: given temperature, pressure, and the initial number of moles of NH3, N2 and H2, compute the equilibrium composition. The thermodynamic entropy is present in this example. What is the meaning of information here?
To solve the problem, one should find the thermodynamic properties of NH3, N2 and H2, for example in the JANAF Tables, and then compute the equilibrium constant.
From the thermodynamic tables (all values are molar values at the standard pressure of 1 bar; I have omitted the symbol ° for simplicity, but it is very important not to forget it):
Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)
2NH3 = N2 + 3H2
Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) – 2 Del_f_H_298(NH3)
Del_S_r_298 = S_298(N2) + 3 S_298(H2) – 2 S_298(NH3)
Del_Cp_r = Cp(N2) + 3 Cp(H2) – 2 Cp(NH3)
To make life simple, I will assume below that Del_Cp_r = 0, but it is not a big deal to extend the equations to include the heat capacities as well.
Del_G_r_T = Del_H_r_298 – T Del_S_r_298
Del_G_r_T = – R T ln Kp
When Kp, the total pressure, and the initial number of moles are given, it is rather straightforward to compute the equilibrium composition. If you need help, please just let me know.
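As an illustration, here is a small sketch of the whole computation for 1 mol of initial NH3. The values for Del_H_r_298 and Del_S_r_298 are approximate textbook numbers, not exact JANAF entries, so treat the output as qualitative only.

```python
# Sketch of the equilibrium calculation for 2 NH3 = N2 + 3 H2 under the
# Del_Cp_r = 0 assumption (approximate thermodynamic values, not exact
# JANAF entries).
import math

R = 8.314          # J/(K mol)
dH_r = 91.8e3      # J/mol: -2 Del_f_H_298(NH3); Del_f_H of N2 and H2 is 0
dS_r = 198.1       # J/(K mol): S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)

def Kp(T):
    dG = dH_r - T * dS_r            # Del_G_r_T = Del_H_r_298 - T Del_S_r_298
    return math.exp(-dG / (R * T))  # from Del_G_r_T = -R T ln Kp

def equilibrium_extent(T, P, n_nh3=1.0):
    """Extent x with n(NH3) = n_nh3 - 2x, n(N2) = x, n(H2) = 3x, by bisection."""
    K = Kp(T)
    def f(x):
        total = n_nh3 + 2 * x
        p_nh3 = (n_nh3 - 2 * x) / total * P
        p_n2 = x / total * P
        p_h2 = 3 * x / total * P
        return p_n2 * p_h2 ** 3 - K * p_nh3 ** 2  # zero at equilibrium
    lo, hi = 1e-12, n_nh3 / 2 - 1e-12
    for _ in range(200):
        mid = (lo + hi) / 2
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return lo

x = equilibrium_extent(600.0, 1.0)  # 600 K, total pressure 1 bar
print(Kp(600.0), x)
```

At 600 K and 1 bar the equilibrium lies far on the side of decomposition: the extent x comes out close to its maximum value of 0.5, that is, almost all the ammonia dissociates.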
1.5) The phase diagram of Fe-C shown in the Introduction. It has been computed from the Thermo-Calc database , which contains entropies obtained by experimental thermodynamics. What is the meaning of information?
2. The Information Entropy: What is the thermodynamic system in these examples?
2.1) An example of the entropy used by engineers in informatics has been given by Jason , whom I will quote below. Could you please tell me: the thermodynamic entropy of what is discussed in his example?
On 03.02.2012 00:14 Jason Resch said the following:
> Sure, I could give a few examples as this somewhat intersects with my
> line of work.
> The NIST 800-90 recommendation (
> http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf )
> for random number generators is a document for engineers implementing
> secure pseudo-random number generators. An example of where it is
> important is when considering entropy sources for seeding a random
> number generator. If you use something completely random, like a
> fair coin toss, each toss provides 1 bit of entropy. The formula is
> -log2(predictability). With a coin flip, you have at best a .5
> chance of correctly guessing it, and -log2(.5) = 1. If you used a
> die roll, then each die roll would provide -log2(1/6) = 2.58 bits of
> entropy. The ability to measure unpredictability is necessary to
> ensure, for example, that a cryptographic key is at least as
> difficult to predict the random inputs that went into generating it
> as it would be to brute force the key.
> In addition to security, entropy is also an important concept in the
> field of data compression. The amount of entropy in a given bit
> string represents the theoretical minimum number of bits it takes to
> represent the information. If 100 bits contain 100 bits of entropy,
> then there is no compression algorithm that can represent those 100
> bits with fewer than 100 bits. However, if a 100 bit string contains
> only 50 bits of entropy, you could compress it to 50 bits. For
> example, let’s say you had 100 coin flips from an unfair coin. This
> unfair coin comes up heads 90% of the time. Each flip represents
> -log2(.9) = 0.152 bits of entropy. Thus, a sequence of 100 coin
> flips with this biased coin could be represent with 16 bits. There
> is only 15.2 bits of information / entropy contained in that 100 bit
> long sequence.
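Jason's per-outcome numbers are easy to reproduce with the formula he gives, bits = -log2(predictability). As a side note of my own (not part of the quote): the average Shannon entropy of the 90/10 coin weighs both outcomes, H = -0.9 log2 0.9 - 0.1 log2 0.1, which is about 0.469 bits per flip, so the compression limit for 100 flips is closer to 47 bits than to 16.

```python
# Reproducing the numbers from the quote: bits = -log2(predictability).
import math

def bits(p):
    """Surprisal of a single outcome with probability p, in bits."""
    return -math.log2(p)

fair_coin = bits(0.5)      # 1 bit per toss
die_roll = bits(1 / 6)     # about 2.58 bits per roll
biased_heads = bits(0.9)   # about 0.152 bits for a heads outcome

# Average (Shannon) entropy of the 90/10 coin, weighing both outcomes:
H_biased = 0.9 * bits(0.9) + 0.1 * bits(0.1)  # about 0.469 bits per flip
print(fair_coin, die_roll, biased_heads, H_biased)
```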
2.2) This is a paper from control theory:

J. C. Willems and H. L. Trentelman, H_inf control in a behavioral context: The full information case, IEEE Transactions on Automatic Control, Volume 44, pages 521-536, 1999.
The term information is there. What is the related thermodynamic entropy?
2.3) In my understanding, when we consider an algorithm, it is a pure IT construct that does not depend on whether I implement it with an abacus or some Turing machine, with an Intel or a PowerPC processor. From this it follows that the algorithm, and hence its information entropy, does not depend on the temperature or pressure of the physical system that does the computation. In my view this makes sense.
Let us now consider consciousness. Our brain produces it, and our brain has some thermodynamic entropy. If we assume that the same effect could be achieved with some robot, does this mean that the thermodynamic entropy of the robot must be the same as that of the brain?
2.4) Information as a representation on a physical object.
Let us consider the string “10” for simplicity, and let us consider the following cases. First I will cite the thermodynamic properties of Ag and Al from the CODATA tables (we will need them):
S ° (298.15 K) J K-1 mol-1
Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10
In J K-1 cm-3 this becomes
Ag cr 42.55/107.87*10.49 = 4.14
Al cr 28.30/26.98*2.7 = 2.83
A) An abstract string “10”.
B) Let us now make an aluminum plate (a page) with “10” hammered on it (as on a coin), with a total volume of 10 cm^3. The thermodynamic entropy is then 28.3 J/K.

C) Let us now make a silver plate (a page) with “10” hammered on it (as on a coin), with a total volume of 10 cm^3. The thermodynamic entropy is then 41.4 J/K.

D) We can easily make another aluminum plate by scaling all dimensions of the plate in B up to a total volume of 100 cm^3. Then the thermodynamic entropy is 283 J/K.
Now we have four different ways to represent the string “10”, and the thermodynamic entropy is different in each case. Any comment?
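The arithmetic behind cases B to D is just the CODATA molar entropy converted to a volumetric one (I use the common handbook densities of 10.49 g/cm^3 for Ag and 2.70 g/cm^3 for Al). Note that the string itself never enters the calculation:

```python
# Thermodynamic entropy of the plates in cases B-D, from the CODATA molar
# entropies; the hammered string "10" plays no role in the numbers.
S_molar = {"Ag": 42.55, "Al": 28.30}  # J/(K mol) at 298.15 K
M = {"Ag": 107.87, "Al": 26.98}       # molar mass, g/mol
rho = {"Ag": 10.49, "Al": 2.70}       # density, g/cm^3

def S_plate(metal, volume_cm3):
    """Entropy of a plate: molar entropy / molar mass * density * volume."""
    return S_molar[metal] / M[metal] * rho[metal] * volume_cm3

case_B = S_plate("Al", 10)   # about 28.3 J/K
case_C = S_plate("Ag", 10)   # about 41.4 J/K
case_D = S_plate("Al", 100)  # about 283 J/K
print(case_B, case_C, case_D)
```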
3. Information in Physics: the number of states
The thermodynamic entropy could be considered a measure of the number of states, and one could say that this is information in physics. I will quote Brent's comment on example 2.4 :
“The thermodynamic entropy is a measure of the information required to locate the possible states of the plates in the phase space of atomic configurations constituting them.”
This is formally correct, but then the question is the relationship of the number of states to information in IT.
3.1) It would certainly be interesting to consider what happens when we increase or decrease the temperature (in the limit of zero kelvin, according to the Third Law, the entropy will be zero). What do you think: can we save less information on a copper plate at low temperatures as compared with higher temperatures? Or more?

If engineers took the statement “the maximum possible value for information increases with temperature” literally, they would operate hard disks at higher temperatures (the higher the better, according to such a statement). Yet this does not happen. Do you know why?

If I operate my memory stick in some reasonable range of temperatures, the information it contains does not change. Yet the entropy, in my view, does change. Why does it happen this way?
3.2) My example would be IBM's Millipede memory device.
I am pretty sure that when the IBM engineers developed it, they did not employ the thermodynamic entropy to estimate its information capacity. Also, an increase of temperature would destroy the information saved there.
3.3) In general, we are surrounded by devices that store information (hard disks, memory sticks, DVDs, etc.). The information these devices can store is, I believe, known to an accuracy of one bit. Can you suggest a thermodynamic state whose entropy gives us exactly that amount of information?
3.4) Let us consider a coin and imagine that the temperature goes to zero kelvin. What happens to the text imprinted on the coin in this case?
The examples presented above betray my personal opinion. Yes, I believe that the thermodynamic entropy and the information entropy are not related to each other. Personally, I find the reasoning that identifies them clumsy. In my view, the same mathematical structure of the equations does not mean that the phenomena are related. For example, the Poisson equation of electrostatics is mathematically equivalent to the stationary heat conduction equation. What does this mean? Well, it allows a creative way to solve electrostatic problems for people who have a thermal FEM solver but no electrostatic solver: they can solve an electrostatic problem with the thermal FEM solver by means of the mathematical analogy. This does happen, but I doubt that we could state that stationary heat conduction is equivalent to electrostatics.
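The analogy can even be demonstrated in a few lines: one and the same relaxation routine solves the discrete stationary heat equation and the electrostatic (Laplace) problem; only the interpretation of the unknowns changes. A toy one-dimensional sketch:

```python
def relax_1d(u, iters=5000):
    """Jacobi relaxation for u'' = 0 with fixed endpoint values."""
    for _ in range(iters):
        u = [u[0]] + [0.5 * (u[i - 1] + u[i + 1])
                      for i in range(1, len(u) - 1)] + [u[-1]]
    return u

# A rod with its ends held at 100 and 0 degrees, or, equally well, the
# region between two capacitor plates at 100 V and 0 V; the converged
# profile is linear in both readings.
profile = relax_1d([100.0] + [0.0] * 10)
print(profile)
```

The code does not know, and does not need to know, whether u is a temperature or a potential; yet nobody would conclude from this that heat conduction is electrostatics.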
Let me also quote Frigg and Werndl, Entropy: A Guide for the Perplexed, p. 28(142): “First, all notions of entropy discussed in this essay, except the thermodynamic and the topological entropy, can be understood as variants of some information-theoretic notion of entropy.”
I understand it this way. When I am working with a gas, a liquid or a solid at the level of experimental thermodynamics, the information, according to the authors, is not there (at this point I agree with them). Yet, as soon as theoretical physicists start thinking about these objects, the objects happen to be fully filled with information.
Alternatively, one could say that the viewpoint “information and entropy are the same” brings us useful conjectures. However, no one has told me exactly what useful conjectures follow. In a discussion on the biotaconv list , I suggested considering two statements:
- The thermodynamic and information entropies are equivalent.
- The thermodynamic and information entropies are completely different.
and asked what difference it makes in artificial life research. I still do not know the answer.
I would like to conclude with a quote from Arnheim . In my view, it nicely characterizes the current status of the relationship between entropy and information.
“The absurd consequences of neglecting structure but using the concept of order just the same are evident if one examines the present terminology of information theory. Here order is described as the carrier of information, because information is defined as the opposite of entropy, and entropy is a measure of disorder. To transmit information means to induce order. This sounds reasonable enough. Next, since entropy grows with the probability of a state of affairs, information does the opposite: it increases with its improbability. The less likely an event is to happen, the more information does its occurrence represent. This again seems reasonable. Now what sort of sequence of events will be least predictable and therefore carry a maximum of information? Obviously a totally disordered one, since when we are confronted with chaos we can never predict what will happen next. The conclusion is that total disorder provides a maximum of information; and since information is measured by order, a maximum of order is conveyed by a maximum of disorder. Obviously, this is a Babylonian muddle. Somebody or something has confounded our language.”
- E. T. Jaynes, Information theory and statistical mechanics, Phys. Rev. Part I: 106, 620–630 (1957), http://dx.doi.org/10.1103/PhysRev.106.620, Part II: 108, 171–190 (1957) http://dx.doi.org/10.1103/PhysRev.108.171
- Google Scholar, http://scholar.google.com/scholar?q=E.+T.+Jaynes
- Books on entropy and information, http://blog.rudnyi.ru/2012/02/books-on-entropy-and-information.html
- J.-O. Andersson, Thomas Helander, Lars Höglund, Pingfang Shi, Bo Sundman, Thermo-Calc & DICTRA, Computational Tools for Materials Science, Calphad, Vol. 26, No. 2, pp. 273-312, 2002, http://dx.doi.org/10.1016/S0364-5916(02)00037-8
- Entropy and Artificial Life, http://blog.rudnyi.ru/2010/12/entropy-and-artificial-life.html
- Entropy and Information, http://groups.google.com/group/embryophysics/t/a14b0a6b9294cf3
- Information: a basic physical quantity or rather emergence/supervenience phenomenon, http://blog.rudnyi.ru/2012/01/information.html
- Entropy: A Guide for the Perplexed, http://blog.rudnyi.ru/2012/02/entropy-a-guide-for-the-perplexed.html
- The thermodynamics of Computation, http://blog.rudnyi.ru/2012/02/the-thermodynamics-of-computation.html
- C. Truesdell, The Tragicomical History of Thermodynamics, 1822-1854 (Studies in the History of Mathematics and Physical Sciences), 1980.
- C. E. Shannon, A mathematical theory of communication [corrected version], Bell System Technical Journal 27, 379–423, 623–656 (1948), http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html
- Russell Standish, http://groups.google.com/group/everything-list/msg/fc0272531101a9e6
- NIST-JANAF Thermochemical Tables, Fourth Edition, J. Phys. Chem. Ref. Data, Monograph No. 9, 1998, http://www.nist.gov/data/PDFfiles/jpcrdM9.pdf (ca. 230 Mb)
- J. D. Cox, D. D. Wagman, V. A. Medvedev, CODATA Key Values for Thermodynamics, Hemisphere Publishing Corp., New York, 1989. http://www.codata.org/resources/databases/key1.html
- Jason Resch, http://groups.google.com/group/everything-list/msg/c2da04dbd4ce2f8d
- Brent, http://groups.google.com/group/everything-list/msg/bd727f700d0d58c0
- Roman Frigg, Charlotte Werndl, Entropy: A Guide for the Perplexed, in Claus Beisbart and Stephan Hartmann (eds.): Probability in Physics, Oxford University Press, 2011, 115-42, http://www.romanfrigg.org/writings/EntropyGuide.pdf
- Rudolf Arnheim, Entropy and Art: An Essay on Disorder and Order, 1971