Entropy: A Guide for the Perplexed

My messages to the everything list

http://groups.google.com/group/everything-list/t/d6932a890e1a670d

05.02.2012 17:16:

On 24.01.2012 22:56 meekerdb said the following:

 In thinking about how to answer this I came across an excellent paper
by Roman Frigg and Charlotte Werndl
http://www.romanfrigg.org/writings/EntropyGuide.pdf which explicates
the relation more comprehensively than I could and which also gives
some historical background and extensions: specifically look at
section 4.

Brent,

I have started reading the pdf. A few comments on Section 2, Entropy in thermodynamics.

The authors seem to be sloppy.

1) p. 2 (116). “If we consider a cyclical process—a process in which the beginning and the end state are the same — a reversible process leaves the system and its surroundings unchanged.”

This is wrong: if one runs the Carnot cycle reversibly, then heat is converted to work (or vice versa) and there are changes in the surroundings. They probably mean that if one runs the Carnot cycle reversibly twice, first in one direction and then in the opposite one, then the surroundings will be unchanged.
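
To make the bookkeeping explicit (standard Carnot-cycle relations, written here for reference, not quoted from the paper):

```latex
% One forward reversible Carnot cycle returns the working body to its
% initial state, but not the surroundings: heat Q_H has left the hot
% reservoir at T_H, heat Q_C has entered the cold reservoir at T_C, and
% the work W has been delivered,
W = Q_H - Q_C, \qquad \frac{Q_H}{T_H} = \frac{Q_C}{T_C}.
% Only after running the same cycle once more in reverse, consuming W
% and returning Q_H and Q_C to their reservoirs, are both the system
% and its surroundings back in their initial states.
```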

2) p. 2 (116). “We can then assign an absolute entropy value to every state of the system by choosing one particular state A (we can choose any state we please!) as the reference point.”

They misuse the conventional terminology. The absolute entropy is defined by the Third Law, and they just want to employ S instead of ΔS. This is pretty dangerous: when one changes the working body in the Carnot cycle, such a notation will lead to a catastrophe.
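
For comparison, the two notions involved are (standard definitions, my notation, not quoted from the paper):

```latex
% Entropy relative to the chosen reference state A, which is what the
% paper treats as an "absolute" value (Clausius definition):
S(X) = S(A) + \int_A^X \frac{\delta Q_{\mathrm{rev}}}{T}
% The conventional absolute entropy is fixed by the Third Law,
% S(T = 0) = 0; for example, for heating at constant pressure
% (phase-transition terms omitted):
S(T) = \int_0^T \frac{C_p(T')}{T'}\,\mathrm{d}T'
```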

3) p. 3 (117). “If we now restrict attention to adiathermal processes (i.e. ones in which temperature is constant), ...”

According to Eq. (4) that they discuss, they actually mean an adiabatic process, in which the temperature is not constant.

However, at the end of this small section they write

p. 3 (117). “S_TD has no intuitive interpretation as a measure of disorder, disorganization, or randomness (as is often claimed). In fact such considerations have no place in TD.”

I completely agree with that, so I am going to read further.

05.02.2012 19:28:

I have browsed the paper. I should say that I am not impressed. The logic is exactly the same as in other papers and books.

I have nothing against the Shannon entropy (Section 3 in the paper). Shannon can use the term entropy (why not), but then we should just distinguish between the informational entropy and the thermodynamic entropy, as they have been introduced for completely different problems.

The logic according to which both entropies are the same is in Section 4, where it is expressed bluntly as follows:

p. 13 (127) “Then, if we regard the w_i as messages, S_B,m(M_i) is equivalent to the Shannon entropy up to the multiplicative constant nk and the additive constant C.”

p. 15 (129) “The most straightforward connection is between the Gibbs entropy and the continuous Shannon entropy, which differ only by the multiplicative constant k.”
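
For reference, the standard formulas being compared are (my notation, not quoted from the paper; the change of logarithm base is just one more multiplicative constant):

```latex
% Discrete case: statistical-mechanical entropy vs. Shannon entropy
S = -k \sum_i p_i \ln p_i, \qquad H = -\sum_i p_i \log_2 p_i
% Continuous case: Gibbs entropy of a phase-space density rho vs. the
% continuous (differential) Shannon entropy of the same density
S_G = -k \int \rho(x)\,\ln\rho(x)\,\mathrm{d}x, \qquad
H = -\int \rho(x)\,\ln\rho(x)\,\mathrm{d}x
```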

Personally, I find this logic clumsy. In my view, the fact that equations have the same mathematical structure does not mean that the phenomena are related. For example, the Poisson equation of electrostatics is mathematically equivalent to the stationary heat conduction equation. So what? Well, one creative use is for people who have a thermal FEM solver and do not have an electrostatic one: they can solve an electrostatic problem with the thermal solver by means of the mathematical analogy. This does happen, but I doubt that we could state that stationary heat conduction is equivalent to electrostatics.
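
As an illustration of that analogy, here is a minimal sketch (a toy one-dimensional finite-difference solver, not an FEM code; the function name and all numbers are illustrative). The very same routine plays the role of the electrostatic and of the thermal solver; only the labels of the coefficient, the unknown and the source term change.

```python
# A toy 1D finite-difference solver for -c * u''(x) = f(x) on [0, 1]
# with fixed boundary values u(0) = u0 and u(1) = u1. The same routine
# serves as an "electrostatic" solver (c = permittivity, u = potential,
# f = charge density) and as a "thermal" solver (c = conductivity,
# u = temperature, f = heat source); only the labels change.
import numpy as np

def solve_1d(c, f, u0, u1, n=101):
    x = np.linspace(0.0, 1.0, n)
    h = x[1] - x[0]
    # Tridiagonal matrix of the standard second-difference stencil
    # for the n - 2 interior nodes.
    main = (2.0 * c / h**2) * np.ones(n - 2)
    off = (-c / h**2) * np.ones(n - 3)
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    b = np.array([f(xi) for xi in x[1:-1]], dtype=float)
    # The known boundary values move to the right-hand side.
    b[0] += c / h**2 * u0
    b[-1] += c / h**2 * u1
    u = np.empty(n)
    u[0], u[-1] = u0, u1
    u[1:-1] = np.linalg.solve(A, b)
    return x, u

# "Electrostatics": potential between two grounded plates with a
# uniform charge density (illustrative numbers).
x, phi = solve_1d(c=8.85e-12, f=lambda x: 1.0e-9, u0=0.0, u1=0.0)

# "Heat conduction": temperature across a wall with a uniform heat
# source and both faces held at 293 K (illustrative numbers).
x, T = solve_1d(c=1.0, f=lambda x: 100.0, u0=293.0, u1=293.0)
```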

It looks most amusing in the conclusion:

p. 28 (142) “First, all notions of entropy discussed in this essay, except the thermodynamic and the topological entropy, can be understood as variants of some information-theoretic notion of entropy.”

I understand it this way. When I am working with a gas, a liquid or a solid at the level of experimental thermodynamics, the information, according to the authors, is not there (at this point I am in agreement with them). Yet, as soon as theoretical physicists start thinking about these objects, the objects turn out to be filled with information.

06.02.2012 20:18:

There is some difference between the entropy in classical and in statistical thermodynamics. I will copy my old text to describe it.

In order to explain this, let us consider a simple experiment. We bring a glass of hot water into the room and leave it there. Eventually the temperature of the water will be equal to the ambient temperature. In classical thermodynamics this process is considered irreversible, that is, the Second Law forbids the water in the glass from becoming hot again spontaneously. This is in complete agreement with our experience, so one would expect the same from statistical mechanics. There, however, the entropy has a statistical meaning, and there is a nonzero chance that the water will become hot again. Moreover, there is a theorem (Poincaré recurrence) which states that if we wait long enough, the water in the glass must become hot again.
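
A toy illustration of this point (the Ehrenfest urn model, a standard textbook caricature; the script below is mine and is not part of the original message): the model relaxes towards equilibrium, yet it provably returns to its initial state, with a mean recurrence time that grows like 2^N and therefore becomes astronomically long for a macroscopic number of particles.

```python
# The Ehrenfest urn model: N labelled particles sit in two urns; at
# every step one particle, picked at random, jumps to the other urn.
# Starting with all particles in urn A (the analogue of the hot glass),
# the model relaxes towards a 50/50 split, and yet it always returns to
# the initial state -- on average after about 2**N steps, which for a
# macroscopic N of the order of 10**23 is never observed in practice.
import random

def recurrence_time(n, seed):
    """Number of steps until all n particles are back in urn A."""
    rng = random.Random(seed)
    in_a = n           # start: everything in urn A
    steps = 0
    while True:
        steps += 1
        # The chosen particle is in urn A with probability in_a / n.
        if rng.random() < in_a / n:
            in_a -= 1  # it jumps from A to B
        else:
            in_a += 1  # it jumps from B to A
        if in_a == n:
            return steps

for n in (4, 8, 12, 16):
    mean = sum(recurrence_time(n, seed) for seed in range(20)) / 20
    print(n, 2**n, round(mean))   # the mean grows roughly like 2**n
```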

Apart from this difference, they are the same. This does not mean, however, that information comes into play in the Boltzmann-Gibbs formulation. You have missed my comment on this, hence I will repeat it.

In my opinion, the similarity of mathematical equations does not mean that the phenomena are the same. Basically it is about definitions. If you define information through the Shannon entropy, that is okay. You have, however, to prove that the Shannon entropy is the same as the thermodynamic entropy. In this respect, the similarity of the equations is, in my view, a weak argument.

Do you have anything else to support the claim that the thermodynamic entropy is information, apart from the fact that the two equations are similar to each other?

07.02.2012 20:42:

I would suggest looking briefly at the history.

Statistical thermodynamics was derived by Boltzmann and Gibbs, and at that time there was no information in it. This lasted for quite a while, and many famous physicists did not find any information in statistical mechanics.

The information entropy started with Shannon’s work, where he writes:

“The form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics, where p_i is the probability of a system being in cell i of its phase space. H is then, for example, the H in Boltzmann’s famous H theorem.”

Yet he just shows that the equation is similar; he does not make a statement about the meaning of such a similarity, that is, he does not identify his entropy with the thermodynamic entropy. He just uses the term, nothing more. So we had two similar equations describing two different phenomena, and this state of affairs again lasted for a while.

Now let me quote from Edwin T. Jaynes’s first paper:

p. 622 after eq (2-3) (this is the Shannon equation) “Since this is just the expression for entropy as found in statistical mechanics, it will be called the entropy of the probability distribution p_i; henceforth we will consider the terms “entropy” and “uncertainty” as synonymous.”

This is exactly the logic that I have mentioned above and that is expressed in the paper you gave me: as the two equations are the same, they describe the same phenomenon. In my view, this is clumsy, and I have given an example where the same mathematical equation describes two different physical phenomena.

If you talk about the same concept, let me ask you the following. The only example of the entropy used by engineers in informatics has been given by Jason, and I quote him below. Could you please tell me: what is the thermodynamic entropy of what is discussed in his example?

On 03.02.2012 00:14 Jason Resch said the following:


> Evgenii,
>
> Sure, I could give a few examples as this somewhat intersects with my
> line of work.
>
> The NIST 800-90 recommendation (
> http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf )
> for random number generators is a document for engineers implementing
> secure pseudo-random number generators.  An example of where it is
> important is when considering entropy sources for seeding a random
> number generator.  If you use something completely random, like a
> fair coin toss, each toss provides 1 bit of entropy.  The formula is
> -log2(predictability).  With a coin flip, you have at best a .5
> chance of correctly guessing it, and -log2(.5) = 1.  If you used a
> die roll, then each die roll would provide -log2(1/6) = 2.58 bits of
> entropy.  The ability to measure unpredictability is necessary to
> ensure, for example, that a cryptographic key is at least as
> difficult to predict the random inputs that went into generating it
> as it would be to brute force the key.
>
> In addition to security, entropy is also an important concept in the
> field of data compression.  The amount of entropy in a given bit
> string represents the theoretical minimum number of bits it takes to
> represent the information.  If 100 bits contain 100 bits of entropy,
> then there is no compression algorithm that can represent those 100
> bits with fewer than 100 bits.  However, if a 100 bit string contains
> only 50 bits of entropy, you could compress it to 50 bits.  For
> example, let’s say you had 100 coin flips from an unfair coin.  This
> unfair coin comes up heads 90% of the time.  Each flip represents
> -log2(.9) = 0.152 bits of entropy.  Thus, a sequence of 100 coin
> flips with this biased coin could be represent with 16 bits.  There
> is only 15.2 bits of information / entropy contained in that 100 bit
> long sequence.
>
> Jason
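
As a side note to the quoted example, here is a minimal script (mine, not part of the quoted message) that reproduces the figures. The value -log2(0.9) ≈ 0.152 is the surprisal of the most likely outcome, the min-entropy used in NIST-style unpredictability estimates; the average (Shannon) entropy per flip of the 90/10 coin is larger. For uniform sources such as the fair coin and the fair die the two measures coincide.

```python
# Shannon entropy H = -sum(p_i * log2(p_i)) and min-entropy
# -log2(max(p_i)) for the three sources mentioned in the example.
from math import log2

def shannon_entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

def min_entropy(probs):
    return -log2(max(probs))

sources = {
    "fair coin":   [0.5, 0.5],
    "fair die":    [1 / 6] * 6,
    "biased coin": [0.9, 0.1],
}
for name, probs in sources.items():
    print(f"{name}: Shannon {shannon_entropy(probs):.3f} bits/symbol, "
          f"min-entropy {min_entropy(probs):.3f} bits/symbol")
# fair coin:   Shannon 1.000, min-entropy 1.000
# fair die:    Shannon 2.585, min-entropy 2.585
# biased coin: Shannon 0.469, min-entropy 0.152
```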

