Classical thermodynamics and entropy

Information has an increasing influence in physics.  While in the 19th century no concept of information was used in physics, this notion plays a more and more central part in the principles of physics today.  A most enigmatic quantity was introduced in thermodynamics in the 19th century, namely "entropy".  Entropy is one of the two principal abstract state variables of a thermodynamic system; the other is "internal energy".  In classical thermodynamics, it is not clear what "entropy" actually is.

Classical thermodynamics is concerned with the macroscopic distinction between heat and work.  The first law of thermodynamics essentially tells us that heat and work are both forms of energy, which can be stored in, or extracted from, a system, and that there is a book-keeping principle:

The increase of the internal energy U of a system is equal to the net amount of work W plus the net amount of heat Q it receives.  This means that a system has a state variable "internal energy":

ΔU = ΔW + ΔQ
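
A minimal Python sketch of this bookkeeping (the figures are made up for illustration):

    # First-law bookkeeping, counting everything the system receives
    # as positive.  Illustrative values only.
    W = 100.0        # the system receives 100 J of work
    Q = -30.0        # it "receives" -30 J of heat, i.e. gives off 30 J
    delta_U = W + Q
    print(delta_U)   # 70.0: the internal energy has risen by 70 J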

The second law is the one that fundamentally distinguishes heat from work.  Work can be turned into heat, but heat cannot simply be turned into work.  In fact, heat can only be turned into work partially, on the condition that a quantity of heat Q1 extracted at temperature T1 is compensated by a quantity of heat Q2 delivered to a body at temperature T2, such that:

Q2 / T2 ≥ Q1 / T1

Work, on the other hand, can always be converted entirely into heat.
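
A numerical sketch of this constraint (temperatures and heat values chosen for illustration): an engine extracting Q1 at T1 must deliver at least Q2 = T2 Q1 / T1 at T2, which caps the extractable work at the Carnot limit.

    T1, Q1 = 600.0, 100.0   # extract 100 J of heat at 600 K
    T2 = 300.0              # cold body at 300 K

    Q2_min = T2 * Q1 / T1   # smallest Q2 satisfying Q2/T2 >= Q1/T1: 50 J
    W_max = Q1 - Q2_min     # at most 50 J of work remains
    print(Q2_min, W_max, W_max / Q1)   # efficiency 1 - T2/T1 = 0.5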

This implies that a system must have a second state variable, called entropy S, such that, when a system receives an amount of heat Q at a temperature T, its entropy rises by at least Q / T:

ΔS ≥ ΔQ/T

The second law can take on different forms, which are all logically equivalent:

  • Kelvin: it is impossible to convert heat into work without at the same time dumping some heat into a colder body.
  • Clausius: it is impossible, without any other effect, to extract an amount of heat from a cold body and deliver that same amount of heat to a hotter body.
  • The entropy of an isolated system can only increase.

From this classical formulation of thermodynamics, it is not at all clear what the link is between this quantity "entropy" and information.  However, the basic distinction is in fact already present: work is energy of which we know the microscopic organisation, and heat is energy of which we do not know the microscopic organisation.  The thermodynamic entropy is in fact the loss of knowledge (the increase in ignorance) about the microscopic organisation of the energy.  But this will only become apparent when we look at the statistical physics description of thermodynamics.

What is interesting is that the second law of thermodynamics defines an arrow of time: ignorance about the micro-state of the energy can only increase with time.

In classical thermodynamics, the two state variables "internal energy" and "entropy" are abstract bookkeeping quantities of systems, needed in order to enforce the first two laws of thermodynamics.  The "physical explanation" of thermodynamics in terms of molecules and the like is called statistical mechanics.

Classical statistical mechanics

In statistical mechanics, one is concerned with the "micro-states" of a system, that is, the mechanical states of each of its molecules and other microscopic degrees of freedom.  Given the enormous number of atoms and molecules in any macroscopic system, the number of possible mechanical states in which all of these molecules can potentially be is mind-bogglingly large.  Of course, for a given physical system with given macroscopic properties, not all micro-states are possible: only certain micro-states are compatible with the macroscopic properties.  Nevertheless, that number of states is still very large.

If we know of a physical system that it has certain macroscopic properties, then there are still myriads of micro-states that could correspond to it.  And now one makes a fundamental hypothesis in statistical physics: one considers the ensemble of systems given by these micro-states, and one assumes them all to be equiprobable.  That is, one assumes a uniform distribution.  In fact, this assumption corresponds to the fact that we have no information other than that these states are the possible ones.  It is the assumption of maximum entropy!  If we give each of these states a probability pi and consider all the possible states, we hence have an imaginary ensemble of possible physical systems that are all compatible with our macroscopic system.
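
A toy illustration of this counting (not from the text: two-state "molecules", i.e. coins, instead of real ones): a macrostate that only fixes the number of heads among N coins leaves C(N, k) compatible micro-states, all of which the uniform-distribution assumption takes to be equiprobable.

    from math import comb, log

    N = 100                           # a hundred two-state "molecules"
    for k in (0, 10, 50):             # macrostate: exactly k heads
        Omega = comb(N, k)            # number of compatible micro-states
        print(k, Omega, log(Omega))   # entropy in units of kB is ln(Omega)

Even this toy system has about 10^29 micro-states compatible with the macrostate "half heads"; a real gas has unimaginably more.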

Once we consider this ensemble, it turns out that the thermodynamic entropy is simply given by the Gibbs entropy:

S = kB Σi pi ln(1/pi)

If there are N states, and we consider a uniform distribution, then pi = 1/N.  Hence:

S = kB ln(N)
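
As a small consistency check (a sketch; the value of kB and the example N are the only inputs): evaluating the Gibbs formula on a uniform distribution reproduces kB ln(N).

    from math import log

    kB = 1.380649e-23                     # Boltzmann constant, J/K

    def gibbs_entropy(p):
        # S = kB * sum_i pi ln(1/pi); terms with pi = 0 contribute nothing
        return kB * sum(pi * log(1.0 / pi) for pi in p if pi > 0.0)

    N = 1000
    print(gibbs_entropy([1.0 / N] * N))   # ~9.54e-23 J/K
    print(kB * log(N))                    # the same: kB ln(N)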

Apart from the fact that there is a constant kB and that one uses the natural logarithm instead of the base-2 logarithm, the thermodynamic entropy is equal to the information entropy of the event "drawing a single micro-state out of all possible micro-states compatible with the macro-state".  In other words, the entropy is the information we are missing about the micro-state.  Heat is energy added in such a way that the number of possible micro-states increases, while work is energy that does not augment the number of possible micro-states.  It is impossible to gain information about the micro-state "by itself", which is what would happen if heat were converted into work within an isolated system.
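
The conversion between the two units can be made explicit (a sketch with an illustrative N): the information entropy of drawing one of N equiprobable micro-states is log2(N) bits, and multiplying by kB ln(2) turns it into the thermodynamic value.

    from math import log, log2

    kB = 1.380649e-23

    N = 2**20                         # say, about a million micro-states
    H_bits = log2(N)                  # 20 bits of missing information
    S_thermo = kB * log(2) * H_bits   # the same ignorance, in J/K
    print(H_bits, S_thermo, kB * log(N))   # the last two values coincide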

Maxwell's demon and Landauer's principle

Maxwell devised a Gedanken experiment that seemed to challenge the second law of thermodynamics.  It came to be known as Maxwell's demon.  The idea is this: consider two gas volumes, separated by a wall in which there is a small hole with a shutter, operated by a little demon.  Initially, both gas volumes are at the same pressure and temperature.  The gas molecules on both sides have a statistical distribution of velocities.  Each time a molecule faster than average arrives from the left side, the demon opens the shutter to let it through to the right side.  Each time a molecule slower than average arrives from the right side, the demon opens the shutter to let it through to the left side.

After a while, the gas will be hotter on the right side than on the left side.  Heat has flowed from a body of low temperature to a body of high temperature, which is in contradiction with Clausius' formulation of the second law.
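
A toy simulation of the demon (not part of the original thought experiment; the speeds, counts and random seed are arbitrary) shows the sorting at work:

    import random

    random.seed(1)
    n = 2000
    left  = [abs(random.gauss(0.0, 1.0)) for _ in range(n)]   # speeds
    right = [abs(random.gauss(0.0, 1.0)) for _ in range(n)]
    avg = sum(left + right) / (2 * n)

    for _ in range(20000):                # molecules arriving at the hole
        if random.random() < 0.5 and left:
            v = random.choice(left)
            if v > avg:                   # fast molecule: let it go right
                left.remove(v)
                right.append(v)
        elif right:
            v = random.choice(right)
            if v < avg:                   # slow molecule: let it go left
                right.remove(v)
                left.append(v)

    # Mean squared speed plays the role of temperature here.
    print(sum(v * v for v in left) / len(left))     # the cold side
    print(sum(v * v for v in right) / len(right))   # the hot side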

The question is: where is the catch?

People have devised many reasons why the demon couldn't do its work, from the movement of the shutter to the measurement of the molecular velocities, but each time a way was found to counter the objection.  Except for one point: from its measurements, the demon needs to store information in order to decide whether or not to open the shutter, and it then needs to erase that information to make its memory available for the next operation.  It turns out that erasing information generates (thermodynamic) entropy.  That is Landauer's principle:

Each time a bit of information is erased from a memory, an irreversible thermodynamic process has taken place that generates:

ΔS = kB ln(2)

of thermodynamic entropy.  If the memory is at temperature T, this means that one has to dissipate at least an amount of heat Q = kB T ln(2).
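
Numerically (room temperature chosen for illustration):

    from math import log

    kB = 1.380649e-23          # Boltzmann constant, J/K
    T = 300.0                  # memory at room temperature

    dS = kB * log(2)           # entropy per erased bit: ~9.57e-24 J/K
    Q_min = kB * T * log(2)    # minimum heat dissipated: ~2.87e-21 J
    print(dS, Q_min)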

In general, Landauer's principle is:

in order to erase an amount of information I from a physical memory, one has to perform an irreversible process that generates a thermodynamic entropy equal to I expressed in thermodynamic units, that is, ΔS = kB ln(2) I
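
As a sketch of this general form (the memory size is a made-up example):

    from math import log

    kB = 1.380649e-23

    def landauer_heat(bits, T):
        # Minimum heat dissipated when erasing `bits` of information
        # from a memory at temperature T: Q = kB * T * ln(2) * bits.
        return kB * T * log(2) * bits

    I = 8 * 2**30                    # erase one gibibyte, in bits
    print(landauer_heat(I, 300.0))   # ~2.5e-11 J, far below what real
                                     # hardware dissipates per gigabyte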

The remarkable point is that it is not the acquisition of information that generates entropy, but the disposal of it.

The maximum entropy principle

In statistical mechanics, there seems to be an arbitrary postulate on which everything is based.  With a given macroscopic state description comes a set of micro-states that are compatible with it.  Often it is postulated that one has to consider a probabilistic ensemble with a uniform distribution of probabilities over these states.  With this ensemble and Gibbs' formula, one arrives at the correct description of the thermodynamic entropy.  But the question is where this uniform probability distribution comes from.

Jaynes raised this to a universal principle: the right ensemble of micro-states to use to describe an equilibrium situation is always the maximum entropy ensemble that satisfies the macroscopic boundary conditions.
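
As a sketch of how the principle operates (the energy levels and the prescribed mean energy are made-up values; that the maximiser is the Boltzmann distribution is a standard result, not derived here): among all distributions over levels Ei with a fixed mean energy, the maximum entropy one is pi ∝ exp(-β Ei), and β can be found numerically.

    from math import exp

    E = [0.0, 1.0, 2.0, 3.0]           # hypothetical energy levels
    target = 1.2                       # prescribed mean energy

    def mean_energy(beta):
        w = [exp(-beta * e) for e in E]
        return sum(e * wi for e, wi in zip(E, w)) / sum(w)

    # mean_energy decreases as beta increases, so bisect for beta.
    lo, hi = -50.0, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if mean_energy(mid) > target else (lo, mid)

    beta = 0.5 * (lo + hi)
    w = [exp(-beta * e) for e in E]
    p = [wi / sum(w) for wi in w]
    print(beta, p)                     # the maximum entropy ensemble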