Information: a basic physical quantity or rather an emergence/supervenience phenomenon

My message to the everything list

On 14.01.2012 08:21 John Clark said the following:
> On Thu, Jan 12, 2012  Craig Weinberg<whatsonster@gmail.com>  wrote:

> For heaven's sake, I went into quite a lot of detail about how the
> code is executed so that protein gets made, and it could not be more
> clear that the cell factory contains digital machines.
>
>> They are not information.
>>
>
> According to you nothing is information and that is one reason it is
> becoming increasingly difficult to take anything you say seriously.

I should say that I also have difficulty with the term information. One question would be, for example, whether information belongs to physics or not. Some physicists say that information is related to the entropy and as such is a basic physical quantity. I personally do not buy this: thermodynamics, as it was originally designed, had nothing to do with information, and information as such does not help to solve thermodynamic problems (more to this end in [1]).

Let us consider, for example, a conventional thermodynamic problem: improving the efficiency of a motor. Is the concept of information helpful in solving this problem? If we look at modern motors, we see that nowadays they work together with controllers that allow us to drive the efficiency toward the thermodynamic limit. The term information is indeed helpful for developing a controller, but what about the thermodynamic limit of the motor itself? Does information help here? In my view, it does not.

In Gray’s book on consciousness (Consciousness: Creeping up on the Hard Problem) there is an interesting discussion of whether physics is enough to explain biology. Gray’s answer is yes, provided we add the laws of cybernetics and evolution. Let me leave evolution aside and discuss the cybernetic laws only, as this is exactly where, I think, information comes into play. A short video from the Artificial Intelligence Class that I recently attended gives a good introduction (an intelligent agent sensing external information and then acting):


http://www.youtube.com/watch?v=cx3lV07w-XE

Thus, the question would be about the relationship between the laws of physics and the laws of cybernetics. When we consider the Equation of Everything, are the cybernetic laws already there, or do we still need to introduce them separately? One possible answer would be that the cybernetic laws emerge from, or supervene on, the physical laws. However, I do not understand what this means. It probably has something to do with a transition from quantity to quality, but I do not understand how that happens either. To me, it remains magic.

Let me repeat a series of physical objects discussed recently (see also [2], [3]):

1) A rock;
2) A ballcock in the toilet;
3) A self-driving car;
4) A living cell.

Where along this series do we have cybernetic laws (information) and where not? Can physics describe these objects without cybernetic laws? What do emergence and supervenience mean along this series? Any ideas?

Evgenii

[1] http://blog.rudnyi.ru/2010/12/entropy-and-artificial-life.html
[2] http://blog.rudnyi.ru/2011/01/perception-feedback-and-qualia.html
[3] http://blog.rudnyi.ru/2011/02/rock-and-information.html

Discussion

http://groups.google.com/group/everything-list/t/a4b4e1546e0d03df

Selected quotes from the discussion are below.

18.01.2012 18:47 John Clark: “

>”Some physicists say that information is related to the entropy”

That is incorrect, ALL physicists say that information is related to entropy. There are quite a number of definitions of entropy; one I like, although not as rigorous as some, does convey the basic idea: entropy is a measure of the number of ways the microscopic structure of something can be changed without changing the macroscopic properties. Thus, the living human body has very low entropy because there are relatively few changes that could be made in it without a drastic change in macroscopic properties, like being dead; a bucket of water has a much higher entropy because there are lots of ways you could change the microscopic position of all those water molecules and it would still look like a bucket of water; cool the water and form ice and you have less entropy because the molecules line up into an orderly lattice so there are fewer changes you could make. The ultimate in high entropy objects is a Black Hole because, whatever is inside one, on the outside any Black Hole can be completely described with just 3 numbers: its mass, spin and electrical charge.”

18.01.2012 20:13 Evgenii Rudnyi: “If you look around, you may still find species of scientists who still work with classical thermodynamics (search, for example, for CALPHAD). Well, whether you refer to them as physicists or not is your choice. Anyway, in experimental thermodynamics people determine entropies, for example in the CODATA tables:

http://www.codata.org/resources/databases/key1.html

S°(298.15 K)
J K-1 mol-1

Ag  cr  42.55 ± 0.20
Al  cr  28.30 ± 0.10

Do you mean that 1 mole of Ag has more information than 1 mole of Al at 298.15 K?

Also remember that at constant volume dS = (Cv/T) dT and dU = Cv dT. If the entropy is information, then its derivative must be related to information as well. Hence Cv must be related to information. This, however, means that the energy is also somehow related to information.

Finally, the entropy is defined by the Second Law, and it would be best to stick to this definition. Only in this case is it possible to understand what we are talking about.”
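To make this point concrete, here is a minimal numerical sketch in Python of the relation dS = (Cv/T) dT and dU = Cv dT quoted above; the heat capacity value is an assumption chosen purely for illustration and is taken as constant over the temperature interval.

-----------------------------------------------
# Minimal sketch: at constant volume dS = (Cv/T) dT and dU = Cv dT,
# so the same heat capacity determines both the entropy change and the
# internal-energy change between two temperatures.
import math

Cv = 25.0                # J K^-1 mol^-1, assumed constant (illustrative value)
T1, T2 = 298.15, 398.15  # K

delta_S = Cv * math.log(T2 / T1)  # integral of (Cv/T) dT from T1 to T2
delta_U = Cv * (T2 - T1)          # integral of Cv dT from T1 to T2

print(f"delta_S = {delta_S:.2f} J/(K mol), delta_U = {delta_U:.0f} J/mol")
-----------------------------------------------

Whatever interpretation one attaches to the entropy, both quantities follow from the same measured Cv, which is the point made in the quote above.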

18.01.2012 23:42 Russell Standish: “Evgenii, while you may be right that some physicists (mostly experimentalists) work in thermodynamics without recourse to the notion of information, and chemists even more so, it is also true that the modern theoretical understanding of entropy (and indeed thermodynamics) is information-based.

This trend really became mainstream with Landauer’s work demonstrating thermodynamic limits of information processing in the 1960s, which turned earlier speculations by the likes of Schroedinger and Brillouin into something that couldn’t be ignored, even by experimentalists.

This trend of an information basis to physics has only accelerated in my professional lifetime – I’ve seen people like Hawking discuss the information processing of black holes, and we’ve seen concepts like the Bekenstein bound linking the geometry of space to information capacity. David Deutsch is surely backing a winning horse in pointing out that algorithmic information theory must be a foundational strand of the "fabric of reality".”

19.01.2012 20:03 Evgenii Rudnyi: “I know that many physicists identify the entropy with information. Recently I had a nice discussion on biotaconv, and people pointed out that Edwin T. Jaynes was presumably the first to make such a connection (Information theory and statistical mechanics, 1957). Google Scholar shows that his paper has been cited more than 5000 times, which is impressive and indeed shows that this view is, in a way, mainstream.

I have studied Jaynes’s papers, but I have got stuck with, for example:

“With such an interpretation the expression “irreversible process” represents a semantic confusion; it is not the physical process that is irreversible, but rather our ability to follow it. The second law of thermodynamics then becomes merely the statement that although our information as to the state of a system may be lost in a variety of ways, the only way in which it can be gained is by carrying out further measurements.”

“It is important to realize that the tendency of entropy to increase is not a consequence of the laws of physics as such, … . An entropy increase may occur unavoidably, due to our incomplete knowledge of the forces acting on a system, or it may be entirely voluntary act on our part.”

This is beyond my understanding. As I have mentioned, I do not buy it; I still consider the entropy as it has been defined, for example, by Gibbs.

Basically, I do not understand what the term information then brings. One can certainly state that information is the same as the entropy (we are free with definitions, after all). Yet I miss the meaning of that. Let me put it this way: we have the thermodynamic entropy and then the informational entropy as defined by Shannon. The first is used to design a motor and the second to design a controller. Now let us suppose that these two entropies are the same. What does this change in the design of a motor or a controller? In my view, nothing.

By the way, have you seen the answer to my question:

>> Also remember that at constant volume dS = (Cv/T) dT and dU =
>> CvdT. If the entropy is information then its derivative must be
>> related to information as well. Hence Cv must be related to
>> information. This however means that the energy also somehow
>> related to information.

If the entropy is the same as information, then through the derivatives all thermodynamic properties are related to information as well. I am not sure whether this makes sense with respect, for example, to designing a self-driving car.

I am aware of works that estimate the thermodynamic limit (kT) for processing information. I do not see, however, how this proves the equivalence of information and entropy.

Evgenii

P.S. For a long time, people have identified entropy with chaos. I have recently read a nice book on this subject, Entropy and Art: An Essay on Disorder and Order by Arnheim, 1971; it is really good.

One quote:

“The absurd consequences of neglecting structure but using the concept of order just the same are evident if one examines the present terminology of information theory. Here order is described as the carrier of information, because information is defined as the opposite of entropy, and entropy is a measure of disorder. To transmit information means to induce order. This sounds reasonable enough. Next, since entropy grows with the probability of a state of affairs, information does the opposite: it increases with its improbability. The less likely an event is to happen, the more information does its occurrence represent. This again seems reasonable. Now what sort of sequence of events will be least predictable and therefore carry a maximum of information? Obviously a totally disordered one, since when we are confronted with chaos we can never predict what will happen next. The conclusion is that total disorder provides a maximum of information; and since information is measured by order, a maximum of order is conveyed by a maximum of disorder. Obviously, this is a Babylonian muddle. Somebody or something has confounded our language.”
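To make concrete the distinction drawn earlier in this message (the thermodynamic entropy used to design a motor versus Shannon’s entropy used to design a controller), here is a minimal Python sketch of the Shannon entropy of a message; the function name is my own, chosen for illustration. Note that the result depends only on symbol frequencies, not on temperature or heat capacity.

-----------------------------------------------
# Minimal sketch: Shannon entropy of a symbol sequence, in bits per symbol.
# This is the quantity a communication engineer works with; it depends only
# on the relative frequencies of the symbols.
import math
from collections import Counter

def shannon_entropy_bits(message: str) -> float:
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy_bits("10"))    # 1.0 bit per symbol
print(shannon_entropy_bits("1110"))  # ~0.81 bits per symbol
-----------------------------------------------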

20.01.2012 05:59 Russell Standish: “Because I tend to think of "negentropy", which is really another term for information, I tend to give priority to Schroedinger, who wrote about the topic in the early 40s. But Jaynes was certainly instrumental in establishing the information-based foundations of statistical physics, even before information was properly defined (it wasn’t really until the likes of Kolmogorov, Chaitin and Solomonoff in the 60s that information was really understood).

But Landauer in the late 60s was probably the first to make physicists really wake up to the concept of physical information.

But then, I’m not a science historian, so what would I know.

I can well recommend Denbigh & Denbigh’s book from the 80s; it’s a bit more of a modern understanding of the topic than Jaynes.”

Entropy in Relation to Incomplete Knowledge

21.01.2012 13:25 Evgenii Rudnyi: “Thanks. On biotaconv they recommended John Avery’s Information Theory and Evolution, but I think I have already satisfied my curiosity with Jaynes’s two papers. My personal feeling is as follows:

1) The concept of information is useless in conventional thermodynamic problems. Let us take, for example, the Fe-C phase diagram:
Fe-C phase diagram

What does information have to do with the entropies of the phases in this phase diagram? Do you mean that I will find an answer in Denbigh’s book?

2) If physicists say that information is the entropy, they must take this literally and then apply experimental thermodynamics to measure information. This, however, does not seem to happen.

3) I work with engineers developing mechatronic products. Thermodynamics (hence the entropy) is there, as well as information. However, I have not yet met a practitioner who makes a connection between the entropy and information.”

22.01.2012 10:04 Evgenii Rudnyi: “To be concrete, here is for example a paper from control theory:

J.C. Willems and H.L. Trentelman
H_inf control in a behavioral context: The full information case
IEEE Transactions on Automatic Control
Volume 44, pages 521-536, 1999
http://homes.esat.kuleuven.be/~jwillems/Articles/JournalArticles/1999.4.pdf

The term information is there, but the entropy is not. Could you please explain why? Or, alternatively, could you please point to papers where engineers use the concept of the equivalence between the entropy and information?”

22.01.2012 19:16 Evgenii Rudnyi: “I have read your paper

http://arxiv.org/abs/nlin/0101006

It is well written. Could you please apply the principles from your paper to the problem of how to determine the information in a book (for example, let us take your book Theory of Nothing)?

Also, do you earnestly believe that this information is equal to the thermodynamic entropy of the book? If yes, can one determine the information in the book just by means of experimental thermodynamics?”

25.01.2012 20:47 Evgenii Rudnyi: “Let me suggest a very simple case to better understand what you are saying. Let us consider the string "10" for simplicity and the following cases. First, I will cite the thermodynamic properties of Ag and Al from the CODATA tables (we will need them):

S°(298.15 K)
J K-1 mol-1

Ag  cr  42.55 ± 0.20
Al  cr  28.30 ± 0.10

In J K-1 cm-3 it will be

Ag  cr  42.55/107.87*10.49 = 4.14
Al  cr  28.30/26.98*2.7 = 2.83

1) An abstract string "10", like the abstract book above.

2) Let us now make an aluminum plate (a page) with "10" hammered on it (as on a coin), with a total volume of 10 cm^3. The thermodynamic entropy is then 28.3 J/K.

3) Let us now make a silver plate (a page) with "10" hammered on it (as on a coin), with a total volume of 10 cm^3. The thermodynamic entropy is then 41.4 J/K.

4) We can easily make another aluminum plate (scaling all dimensions from case 2) with a total volume of 100 cm^3. The thermodynamic entropy is then 283 J/K.

Now we have four different ways to represent the string "10", and the thermodynamic entropy is different in each. If we take the statement literally, then the information must be different in all four cases and defined uniquely, since the thermodynamic entropy is already there. Yet in my view this makes little sense.

Could you please comment on these four cases?”
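As a minimal Python sketch, the arithmetic of the four cases above can be reproduced from the CODATA values, and the resulting entropies can additionally be expressed in bits via S / (k_B ln 2); the conversion to bits is my own addition for illustration.

-----------------------------------------------
# Minimal sketch: thermodynamic entropy of the plates above from CODATA
# molar entropies at 298.15 K, also expressed in bits (S / (k_B ln 2))
# for comparison with the 2-bit string "10".
import math

K_B = 1.380649e-23  # J/K, Boltzmann constant

# element: (molar entropy J K^-1 mol^-1, molar mass g/mol, density g/cm^3)
data = {"Ag": (42.55, 107.87, 10.49), "Al": (28.30, 26.98, 2.70)}

def plate_entropy(element, volume_cm3):
    s_molar, molar_mass, density = data[element]
    return s_molar / molar_mass * density * volume_cm3  # J/K

for element, volume in [("Al", 10.0), ("Ag", 10.0), ("Al", 100.0)]:
    s = plate_entropy(element, volume)
    bits = s / (K_B * math.log(2))
    print(f"{element}, {volume:5.1f} cm^3: S = {s:5.1f} J/K  (~{bits:.1e} bits)")
-----------------------------------------------

Each plate corresponds to roughly 10^24 bits or more, vastly more than the two bits of the hammered string; this scale mismatch is exactly what the next reply addresses.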

25.01.2012 21:25 Brent: “The thermodynamic entropy is a measure of the information required to locate the possible states of the plates in the phase space of atomic configurations constituting them. Note that the thermodynamic entropy you quote is really the *change* in entropy per degree at the given temperature. It’s a measure of how much more phase space becomes available to the atomic states when the internal energy is increased. More available phase space means more uncertainty of the exact actual state and hence more information entropy. This information is enormous compared to the “01” stamped on the plate, the shape of the plate or any other aspects that we would normally use to convey information. It would only be in case we cooled the plate to near absolute zero and then tried to encode information in its microscopic vibrational states that the thermodynamic and the encoded information entropy would become similar. ”

27.01.2012 20:21 Evgenii Rudnyi: “I would say that from your answer it follows that engineering information has nothing to do with the thermodynamic entropy. Don’t you agree?

It would certainly be interesting to consider what happens when we decrease the temperature (in the limit, to zero Kelvin). According to the Third Law, the entropy will then be zero. What do you think: can we store less information on a copper plate at low temperatures as compared with higher temperatures? Or more?”

27.01.2012 21:43 Evgenii Rudnyi: “I am just trying to understand the meaning of the term information that you use. I would say that there is the thermodynamic entropy and then the Shannon information entropy. Shannon developed his theory to help engineers deal with communication (I believe that you recently made a similar statement). Yet, in my view, when we talk about communication devices and mechatronics, the information that engineers are interested in has nothing to do with the thermodynamic entropy. Do you agree or disagree with that? If you disagree, could you please give an example from engineering where engineers do employ the thermodynamic entropy as an estimate of information? My example would be Millipede:

http://en.wikipedia.org/wiki/Millipede_memory

I am pretty sure that when IBM engineers developed it, they did not employ the thermodynamic entropy to estimate its information capacity. Also, an increase in temperature would destroy the information saved there.

Well, I might indeed be deliberately obtuse, yet with the sole goal of reaching a clear definition of what information is. Right now I would say that there is information in engineering and information in physics, and they are different. The first I roughly understand, the second I do not.”

28.01.2012 12:05 Evgenii Rudnyi: “If engineers took the statement "the maximum possible value for information increases with temperature" literally, they would operate hard disks at higher temperatures (the higher the better, according to such a statement). Yet this does not happen. Do you know why?

In general, we are surrounded by devices that store information (hard discs, memory sticks, DVDs, etc.). The information that these devices can store is, I believe, known to an accuracy of one bit. Can you suggest a thermodynamic state whose entropy gives us exactly that amount of information?

Here again there would be a question about temperature. If I operate my memory stick within some reasonable range of temperatures, the information it contains does not change. Yet the entropy, in my view, does change.”
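As a rough numerical illustration of the question above, one can compare the advertised capacity of a storage device with the thermodynamic entropy of its material expressed in bits. The numbers below are my own assumptions: a hypothetical 8 GB memory stick, with the 10 cm^3 aluminum plate from the earlier example standing in for the carrier material.

-----------------------------------------------
# Rough comparison: advertised capacity of a hypothetical 8 GB memory stick
# versus the thermodynamic entropy of 10 cm^3 of aluminum at 298.15 K
# (28.3 J/K, from the estimate above), both expressed in bits.
import math

K_B = 1.380649e-23       # J/K, Boltzmann constant
capacity_bits = 8e9 * 8  # 8 GB, assumed device capacity
thermo_bits = 28.3 / (K_B * math.log(2))

print(f"advertised capacity:   {capacity_bits:.2e} bits")
print(f"thermodynamic entropy: {thermo_bits:.2e} bits")
-----------------------------------------------

The two numbers differ by roughly fourteen orders of magnitude, which is why no thermodynamic state of the device seems to give exactly the advertised amount of information.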

29.01.2012 16:23 Evgenii Rudnyi: “A good suggestion. It may well be that I express my thoughts unclearly; sorry for that. Yet, I think that my examples show that:

1) There is information that engineers employ.

2) There is the thermodynamic entropy.

3) Numerical values in 1) and 2) are not related to each other.

Otherwise, I would appreciate it if you would express the relationship between the information that engineers use and the thermodynamic entropy in your own words, as this is the question that I would like to understand.

I understand you when you talk about the number of microstates. I do not understand, though, how they are related to the information employed by engineers. I would be glad to hear your comment on that.”

29.01.2012 16:30 Evgenii Rudnyi: “The problem that I see is that the entropy changes when the temperature changes. Or do you claim that the entropy of a memory stick/DVD/hard disc remains the same when its temperature changes, for example, from 15 to 25 degrees?

Anyway, I do not see how one can obtain the information capacity of these storage devices from the thermodynamic entropy, and this is my point.

Do you claim that the information capacity of a memory stick/DVD/hard disk, for which we pay money, is equivalent to the thermodynamic entropy of the device?”

01.02.2012 21:10 Evgenii Rudnyi: “First, the thermodynamic entropy is not context dependent. This must mean that if it is the same as information, then the latter must not be context dependent either. Could you please give me an example of a physical property that is context dependent?

Second, when I have different numerical values, this could mean that the units are different. Yet, if this is not the case, then in my view we are talking about two different entities.

Could you please explain, then, what 1) and 2) have in common?”

01.02.2012 21:17 Evgenii Rudnyi: “I believe that you once mentioned that information is negentropy. If so, could you please comment on that? What would negentropy mean?

In general, I do not understand what it means that information at zero Kelvin is zero. Let us take a coin and cool it down. Do you mean that the text on the coin will disappear? Or do you mean that no device can read this text at zero Kelvin?”

02.02.2012 19:32 Evgenii Rudnyi: “Could you please expand on what you mean by context dependent? Often people employ the same words, but the meaning is completely different (as happens, in my view, with the entropy in thermodynamics and in information theory).

When Russell says that information is context dependent, we talk, for example, about a DVD. The information capacity as defined by the company and the number of physical states are then completely different. Hence the notion from Russell that information is context dependent.

Do you mean that mass is context dependent in the same sense as above? If yes, could you please explain it a bit more?”

02.02.2012 19:45 Evgenii Rudnyi: “Boltzmann’s constant, as far as I understand, is defined uniquely. If you talk about some other universe (or Platonia) where one could imagine something else, then perhaps. Yet, in the world that we know from empirical scientific studies, Boltzmann’s constant is a fundamental constant. Hence I do not understand you in this respect.

Indeed, temperature is not available directly at the level of particles obeying classical or quantum laws. However, this could be a problem not with temperature but rather with the description at the particle level.

Anyway, I would suggest sticking to the empirical scientific knowledge that we have. Then I do not understand what you mean by temperature being context dependent either.

We can indeed imagine very different worlds. Yet right now we are discussing the following question (I will repeat it from the email to John):

When Russell says that information is context dependent, we talk, for example, about a DVD. The information capacity as defined by the company and the number of physical states are then completely different. Hence the notion from Russell that information is context dependent.

If you mean that the temperature and the Boltzmann constant are context dependent in the same way, could you please give practical examples?”

02.02.2012 20:23 Evgenii Rudnyi: “Yes, I agree with this, but I think it changes nothing regarding the term information. We have the number of physical states in a carrier (which is indeed influenced, for example, by the arrangement of ink on a page), and we have the information capacity as defined by the company that sells the carrier.

By the way, the example with zero temperature (or, strictly speaking, with temperature going to zero Kelvin) seems to show that the information capacity could even be greater than the number of physical states.”

03.02.2012 20:50 Evgenii Rudnyi: “I guess that you have never done a lab in experimental thermodynamics. There are classical experiments in which people measure the heat of combustion, heat capacity, equilibrium pressure, and equilibrium constants, and then determine the entropy. If you do this, you see that you can measure the entropy in the same way as other properties; there is no difference. A good example to this end is the JANAF Thermochemical Tables (Joint Army-Navy-Air Force Thermochemical Tables). You will find a PDF here:

http://www.nist.gov/data/PDFfiles/jpcrdM9.pdf

It is about 230 MB, but I guess it is doable to download it. Please open it and explain what the difference is between the tabulated entropy and the other properties there. How would your personal viewpoint on a thermodynamic system influence the numerical values of the entropy tabulated in JANAF? What is the difference from mass or length? I do not see it.

You see, the JANAF Tables were started by the military. They needed them to compute, for example, the combustion process in rockets, and they have been successful. What part of a rocket, then, is context dependent?

This is the main problem with the books on entropy and information: they do not consider thermodynamic tables, and they do not work out simple thermodynamic examples. For example, let us consider the following problem:

———————————————–
Problem. Given temperature, pressure, and initial number of moles of NH3, N2 and H2, compute the equilibrium composition.

To solve the problem, one should find the thermodynamic properties of NH3, N2 and H2, for example in the JANAF Tables, and then compute the equilibrium constant.

From the thermodynamic tables (all values are molar values at the standard pressure of 1 bar; I have omitted the symbol ° for simplicity, but it is very important not to forget it):

Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)

2NH3 = N2 + 3H2

Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) - 2 Del_f_H_298(NH3)

Del_S_r_298 = S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)

Del_Cp_r = Cp(N2) + 3 Cp(H2) - 2 Cp(NH3)

To make life simple, I will assume below that Del_Cp_r = 0, but it is not a big deal to extend the equations to include heat capacities as well.

Del_G_r_T = Del_H_r_298 - T Del_S_r_298

Del_G_r_T = - R T ln Kp

When Kp, the total pressure, and the initial number of moles are given, it is rather straightforward to compute the equilibrium composition. If you need help, please just let me know.
———————————————–

So, the entropy is there. What is context dependent here? Where is the difference from mass and length?”
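For completeness, here is a minimal Python sketch of the worked problem above for 2NH3 = N2 + 3H2, with Del_Cp_r = 0 as assumed in the text and starting from pure NH3 for simplicity. The thermodynamic data are approximate handbook values inserted only for illustration, not copied from a particular JANAF page.

-----------------------------------------------
# Minimal sketch: equilibrium composition for 2 NH3 = N2 + 3 H2 at given
# temperature and total pressure, assuming Del_Cp_r = 0, ideal gases and
# a standard pressure of 1 bar. Data are approximate, for illustration only.
import math

R = 8.314  # J K^-1 mol^-1, gas constant

# Approximate standard values at 298.15 K (Df_H of N2 and H2 is zero)
Df_H_NH3 = -45.9e3                       # J/mol
S_NH3, S_N2, S_H2 = 192.8, 191.6, 130.7  # J K^-1 mol^-1

Del_H_r = -2.0 * Df_H_NH3                  # Del_H_r_298
Del_S_r = S_N2 + 3.0 * S_H2 - 2.0 * S_NH3  # Del_S_r_298

def equilibrium_composition(T, p_bar, n_nh3_0):
    """Equilibrium moles of (NH3, N2, H2), starting from pure NH3."""
    Del_G_r = Del_H_r - T * Del_S_r
    Kp = math.exp(-Del_G_r / (R * T))

    def residual(xi):  # reaction quotient minus Kp; increases with xi
        nh3, n2, h2 = n_nh3_0 - 2.0 * xi, xi, 3.0 * xi
        total = n_nh3_0 + 2.0 * xi
        q = (n2 / total) * (h2 / total) ** 3 / (nh3 / total) ** 2 * p_bar ** 2
        return q - Kp

    lo, hi = 1e-12, n_nh3_0 / 2.0 - 1e-12  # bounds on the extent of reaction
    for _ in range(200):                   # bisection
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if residual(mid) < 0.0 else (lo, mid)
    xi = 0.5 * (lo + hi)
    return n_nh3_0 - 2.0 * xi, xi, 3.0 * xi

nh3, n2, h2 = equilibrium_composition(600.0, 1.0, 1.0)
print(f"600 K, 1 bar: NH3 = {nh3:.3f}, N2 = {n2:.3f}, H2 = {h2:.3f} mol")
-----------------------------------------------

Nothing in this calculation requires interpreting S_298 as information; the tabulated entropies enter in exactly the same way as the enthalpies of formation.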

06.02.2012 20:20 Evgenii Rudnyi:

I do not get your point. Do you mean that sometimes surface effects could be important? Every thermodynamicist knows this. However, I do not understand your problem. The thermodynamics of surface phenomena is well established, and to work with it you need to extend the JANAF Tables with other tables. What is the problem?

It would be good if you defined better what you mean by context dependent. As far as I remember, you used this term with respect to the information capacity of some modern information carrier and its number of physical states. I would suggest staying with this example as the definition of context dependent. Otherwise, it does not make much sense.

The entropy is well defined for a nonequilibrium system as long as one can use a local temperature. There are some rare occasions where the local temperature is ambiguous, for example in a plasma, where one defines different temperatures for electrons and molecules. Yet, once the two temperatures are defined, the entropy again becomes well defined.

We are again at the definition of context dependent. What you are saying now is that when there are new physical effects, it is necessary to take them into account. What does this have to do with your example in which the information on an information carrier was context dependent?

 08.02.2012 20:32 Evgenii Rudnyi:

As I have mentioned, I would like to understand what you mean. In order to achieve this, I suggest considering simple problems to which your theory can be applied. I think it is best to understand a theory by means of simple practical applications. Why do you consider this a cheap rhetorical trick?

What I personally observe is that there is information in informatics and information in physics (if we say that the thermodynamic entropy is information). If you would agree that these two kinds of information are different, that would be fine with me; I am flexible with definitions.

Yet, if I understand you correctly, you mean that the information in informatics and the thermodynamic entropy are the same. This puzzles me, as I believe that the same physical quantities should have the same numerical values. Hence my wish to understand what you mean. Unfortunately, you do not want to disclose it; you do not want to apply your theory to the examples that I present.

