# The Unavoidable Cost Of Computation Revealed

Some quotes (mostly mine) from the discussion on the everything list

13.03.2012 18:28 Evgenii Rudnyi:

Could you please give one example from physics (yet please not a thought experiment) where information allows us to reduce entropy?

13.03.2012 20:09 Brent:

http://www.nature.com/news/2010/101114/full/news.2010.606.html

25.03.2012 15:44  Evgenii Rudnyi:

I have looked at the paper that you linked.

Shoichi Toyabe, Takahiro Sagawa, Masahito Ueda, Eiro Muneyuki & Masaki Sano, "Experimental demonstration of information-to-energy conversion and validation of the generalized Jarzynski equality", Nature Physics 6, 988–992 (2010).

I should say that I am not impressed. One can indeed build a feedback mechanism (by the way, it is quite common in engineering), but then, in my view, we should consider the whole system at once. What is the information then, and what is its relationship with the entropy of the whole system?

By the way, the information about the position of the bead has nothing to do with its entropy. This is exactly what happens in any feedback system. One can introduce information, especially with digital control, but it has nothing to do with the thermodynamic entropy.

Then I like

"In microscopic systems, thermodynamic quantities such as work, heat and internal energy do not remain constant."

The authors seem to forget that work and heat are not state functions. How could work and heat remain constant even in a macroscopic system?

I also find the assumption at the beginning of the paper

"Note that, in the ideal case, energy to place the block can be negligible; this implies that the particle can obtain free energy without any direct energy injection."

funny. Once the block is there, the particle will jump in the direction of the block and interact with it. This interaction will force the particle to jump in the other direction, and I would say the energy is there. The authors should have defined better what they mean by direct energy injection.

In essence, in my view the title "information-to-energy conversion" is a word game. It could work when, instead of considering the whole system in question, one concentrates on a small subsystem. Say, if I consider a thermostat, then I could also say that information about the current temperature is transmitted to the heater and thus turned into energy. I am not sure if this makes sense though.
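To make the disputed phrase concrete, here is a small sketch of the textbook Szilard-engine idealization (my own toy model, not the experiment in the paper, and with units chosen so that kT = 1): a partition is inserted at fraction x of a one-particle box, the side containing the particle is learned by measurement, and the partition is then withdrawn quasi-statically. In this ideal, error-free case the average extracted work equals kT times the Shannon information of the measurement, which is the bound usually meant by "information-to-energy conversion".

```python
import math

kT = 1.0  # thermal energy; units chosen so kT = 1

def average_extracted_work(x):
    # Partition at fraction x: particle found on the left with
    # probability x. Quasi-static isothermal expansion of a
    # one-particle gas from volume fraction v to the full box
    # extracts kT * ln(1/v).
    return x * kT * math.log(1.0 / x) + (1.0 - x) * kT * math.log(1.0 / (1.0 - x))

def shannon_information(x):
    # Information gained by the (error-free) measurement, in nats.
    return -(x * math.log(x) + (1.0 - x) * math.log(1.0 - x))

for x in (0.5, 0.3, 0.1):
    print(f"x={x}: <W> = {average_extracted_work(x):.4f} kT, "
          f"I = {shannon_information(x):.4f} nats")
```

For x = 0.5 both quantities equal kT ln 2; the equality holds for every x only because the model is ideal (no measurement error, no cost of running the controller), which is exactly the kind of whole-system accounting questioned above.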

27.03.2012 20:26  Evgenii Rudnyi:

I have nothing against fundamental science, and I do not expect a practical application from this paper.

Yet I do not see fundamental results. What is in the paper is just a change of vocabulary. I would say that we are free to choose a definition. Well, right now, when free will is under question, such a statement might be ill-posed, but I guess that you understand what I mean.

Let me start with "extracting energy from random molecular motion". Let us consider the following example. A macroscopic ball is flying in one direction. We suddenly erect a potential barrier in its path, and the ball flies back after the collision with this potential wall. Do we inject energy into the system to change the ball's trajectory or not? Could you please compare this example with the experiment described in the paper? What is the difference between the wall in this example and the potential walls in the experiment?
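The ball example can be put in numbers (an idealized elastic reflection with made-up values, just to fix ideas): a rigid wall reverses the velocity, so the kinetic energy is unchanged while the momentum changes by −2mv, an impulse supplied by the wall without any net energy injection.

```python
# Ideal elastic reflection of a macroscopic ball from a rigid wall.
m = 2.0        # kg (illustrative value)
v = 5.0        # m/s, towards the wall
v_after = -v   # the ideal wall simply reverses the velocity

kinetic_before = 0.5 * m * v**2
kinetic_after = 0.5 * m * v_after**2
momentum_change = m * v_after - m * v   # impulse delivered by the wall

print(kinetic_before, kinetic_after)  # equal: no energy injected
print(momentum_change)                # -2*m*v: momentum is changed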

My point above is that I am not yet convinced that the energy in the experiment is extracted from random molecular motion. It might be possible to state this, but then, in my view, some change in the normal vocabulary is needed. The paper takes this for granted. Hence, I am not convinced.

Then

“What you asked for was an example of using information to reduce entropy: not obtaining information AND using it to reduce entropy.”

What do you mean here? I see two statements:

“using information to reduce entropy”

and

“obtaining information AND using it to reduce entropy”

What in your view has been done in the paper, and what difference do you see between these two statements?

Finally, when I quoted a statement from the paper

“In microscopic systems, thermodynamic quantities such as work, heat and internal energy do not remain constant”

I meant the following. A thermodynamic system has an internal energy, a Gibbs energy, an entropy and other state functions. However, a thermodynamic system does not possess work or heat; they are not state functions. A thermodynamic system can perform work or produce heat when it goes from one state to another. Hence the statement above, as such, is just sloppy.
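The path dependence of work can be shown with a standard ideal-gas example (illustrative numbers of my choosing): take one mole from state (V1, T) to state (V2, T) either by a reversible isothermal expansion, or by an isobaric expansion followed by isochoric cooling. The endpoints are identical, so every state function changes by the same amount (ΔU = 0 for an ideal gas), yet the work differs between the two paths.

```python
import math

R = 8.314        # J/(mol K)
n, T = 1.0, 300.0
V1, V2 = 1.0, 2.0  # only the ratio V2/V1 matters here

# Path A: reversible isothermal expansion from V1 to V2 at temperature T.
W_A = n * R * T * math.log(V2 / V1)

# Path B: isobaric expansion at P1 = nRT/V1 from V1 to V2 (the gas
# heats up), then isochoric cooling back to T. Same endpoints as path A.
W_B = n * R * T * (V2 / V1 - 1.0)

# Internal energy of an ideal gas depends only on T, so dU = 0 on both
# paths, yet the work performed differs: work is not a state function.
print(W_A, W_B)
```

With V2/V1 = 2, path A gives nRT ln 2 while path B gives nRT, so asking whether "work remains constant" for a system is a category error in any size regime, which is the point made above.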

28.03.2012 20:16  Evgenii Rudnyi:

Thank you for your answer. I have thought about it more, and I believe that I now understand the paper better.

I would agree that an ideal potential barrier, provided it is created intelligently, does not inject energy into either a microsystem or a macrosystem. Well, if the potential barrier has some thickness, then when we insert it, it has to push the medium away. Also, if we do not know the position of the bead exactly, it may well be that the wall pushes the bead directly. Hence one cannot exclude that the potential wall injects some energy as well, but presumably this can be neglected.

Still, I do not understand exactly how to describe the influence of the wall on the physical system. In the ideal case it does not change the energy of the system, but it definitely changes the momentum in the case of the ball. In a microsystem, provided the wall passes through the medium only, the momentum could nevertheless stay the same, as the changes from the two sides of the wall might cancel each other. It could be.

In any case, it is more interesting what happens with the information. I also agree that in this case the information is processed by the controller; that is, there are measurements, the results go into the controller, and after some processing it takes an action.

Therefore, in my view, the title of the paper, "information-to-energy conversion", is misleading. By the way, the authors are talking about energy, not entropy.

What happens is that we have a multidomain system with different interactions between different subsystems. Using some very specific vocabulary, one can presumably find a meaning in a statement such as "information-to-energy conversion" (or, if you want, "information-to-entropy conversion"). As I have already mentioned, this could work if we limit the analysis to one subsystem of the whole system. Yet then the information will be context-dependent, so I am not sure whether such an experiment allows a strict definition of information as a property of a physical system.

Again, I have nothing against the experiment as such. It looks interesting. What is missing is a good theoretical analysis in which one starts from the whole system, including the controller (I guess there are computations there), and writes down all the assumptions needed to reach the conclusion "information-to-energy conversion". It would be nice to understand how information emerges from the movements of atoms and molecules in the whole system, including the controller.
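One piece of such bookkeeping does exist in the literature: the generalized Jarzynski equality that the paper sets out to validate, ⟨exp(−W/kT − I)⟩ = 1, where W is the work done on the system and I the stochastic information gained by the measurement. Here is a hedged numerical check for an idealized, error-free Szilard cycle (my toy model, not the authors' analysis; ΔF = 0 per cycle and kT = 1). In this ideal limit the equality happens to be satisfied sample by sample, not just on average.

```python
import math
import random

kT = 1.0
random.seed(0)
x = 0.3  # partition inserted at fraction x of the box

def one_cycle():
    # Error-free measurement of which side holds the particle.
    left = random.random() < x
    p = x if left else 1.0 - x
    I = math.log(1.0 / p)   # stochastic information of the outcome, nats
    W = -kT * I             # work done ON the particle (negative: extracted)
    return W, I

# Generalized Jarzynski equality with feedback: <exp(-W/kT - I)> = 1
samples = [one_cycle() for _ in range(10000)]
avg = sum(math.exp(-W / kT - I) for W, I in samples) / len(samples)
print(avg)
```

The interesting cases are of course the non-ideal ones (measurement errors, controller dissipation), where W and I no longer cancel per cycle and only the average stays at 1; that is precisely where a whole-system analysis, including the controller, would have to do real work.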
