Entropy

Now that the semester – and three successive major storms – has finally ended, I can publish a blog entry again.  Yay!

So my choice of topic, perhaps appropriate for the holiday season, is ‘entropy.’  First off: what is it?  Everyone has their own ‘pet’ definition, but the most common ones usually involve the notion of disorder in some way.  Here are a few definitions of entropy:

  1. Entropy is a measure of how disordered a system is.
  2. Entropy is a measure of the amount of information we can gain from a system.
  3. Entropy is a logarithmic rescaling of multiplicity and thus a measure of the number of possible configurations a system may have.
  4. Entropy is a purely mathematical quantity characterizing probability distributions and their relations to one another.

Disorder, however, is not the best way to discuss entropy.  Tom Moore gives several examples in his excellent textbook Six Ideas That Shaped Physics of pairs of systems where the one we would most likely view as having greater disorder actually has less entropy.  Tom’s preferred definition is the third one I presented above.  The second definition, of course, is the one given by Claude Shannon and used by information theorists.  The last is a purely mathematical notion, recently shown (by a new colleague of mine here at Saint A’s) to be related to finitary isomorphisms.
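For concreteness, here are the standard textbook forms behind the second and third definitions (these aren’t specific to Moore’s presentation; they’re just the usual Boltzmann and Shannon expressions, with \Omega the multiplicity and \{p_i\} a probability distribution over microstates):

    S = k_B \ln \Omega              % definition 3: Boltzmann entropy, a logarithmic rescaling of the multiplicity

    H = -\sum_i p_i \log p_i        % definition 2: Shannon entropy; base 2 gives bits, base e gives nats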

A consistent definition of entropy does not exist, and I have argued we should be working toward one, but I have been met with disdain, likely because people are married to their own interpretations.  In any case, the mathematical descriptions of the last three definitions given above are equivalent, so perhaps a new mathematical description is not necessary.  However, as Moore clearly demonstrates, dumping the notion of disorder is probably a good idea.
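As a quick sanity check on that equivalence (a standard exercise, nothing novel): apply Shannon’s formula, measured in nats so the constants line up, to a system whose \Omega microstates are all equally likely, i.e. p_i = 1/\Omega.  It collapses to the logarithm of the multiplicity, which is Boltzmann’s entropy up to the factor of k_B:

    H = -\sum_{i=1}^{\Omega} \frac{1}{\Omega} \ln \frac{1}{\Omega} = \ln \Omega = \frac{S}{k_B}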

What I think might be a way to tie them all together is reinterpreting all of those definitions within what is sometimes called the “interaction picture.”  In other words, the universe is made up of lots of stuff that interacts with lots of other stuff.  The “stuff” exchanges momentum and energy when interacting.  Broadly speaking, entropy is still the amount of information we can obtain, but about the interaction and the systems involved.  In this definition, some interaction is required (perhaps between the observer and a system) in order to obtain any information.  This then fits naturally into most interpretations of quantum mechanics (one might, naïvely, think of Heisenberg’s gamma-ray microscope, though that was long ago shown to be a problematic concept).

So, here’s a challenge for the holidays: give me your views on entropy.  What do you see it as being?  The prize for my favorite entry will be an autographed photo of a pickle.
