While we have definitively determined that Chu is not a member of the APS's Topical Group on Quantum Information (even though at least one of his Nobel co-laureates is), I have since discovered that he is a 1970 graduate of the University of Rochester, esteemed alma mater of my father (BA English, 1966). What does this mean? Nothing, really, since he didn’t overlap with my dad, though I guess it does mean he spent four years of his life in Western New York (though in the inferior of the two cities in the region).
Archive for December, 2008
Now that the semester – and three successive major storms – has finally ended, I can publish a blog entry again. Yay!
So my choice of topic, perhaps appropriate for the holiday season, is ‘entropy.’ First off: what is it? Everyone has their own ‘pet’ definition, but the most common ones usually involve the notion of disorder in some way. Here are a few definitions of entropy:
- Entropy is a measure of how disordered a system is.
- Entropy is a measure of the amount of information we can gain from a system.
- Entropy is a logarithmic rescaling of multiplicity and thus a measure of the number of possible configurations a system may have.
- Entropy is a mathematical representation of probability distributions and their relations to one another.
Disorder, however, is not the best way to discuss entropy. In his excellent textbook Six Ideas That Shaped Physics, Tom Moore gives several examples of pairs of systems in which the one we would most likely view as more disordered actually has less entropy. Tom’s preferred definition is the third one I presented above. The second definition, of course, is the one given by Claude Shannon and used by information theorists. The last is a purely mathematical notion, recently shown (by a new colleague of mine here at Saint A’s) to be related to finitary isomorphisms.
A consistent definition of entropy does not exist. I have argued that we should be working toward one, but I have been met with disdain, likely because people are married to their own interpretations. In any case, the mathematical descriptions of the last three definitions given above are equivalent, so perhaps a new mathematical description is not necessary. However, as Moore clearly demonstrates, dumping the idea of disorder is probably a good idea.
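To see the equivalence of the second and third definitions concretely, here is a minimal sketch (in plain Python, with the Boltzmann constant set to 1 and entropy measured in nats). For a system whose W microstates are equally likely, Shannon’s formula reduces to the logarithm of the multiplicity, while a sharply peaked distribution yields less entropy:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * ln p), in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A system with multiplicity W = 8 equally likely microstates:
W = 8
uniform = [1 / W] * W

# For the uniform case, Shannon's formula equals the log of the
# multiplicity, i.e. Boltzmann's S = ln W (with k_B = 1):
print(shannon_entropy(uniform))   # same as math.log(8), about 2.079

# A sharply peaked distribution over the same 8 states carries
# less entropy than the uniform one:
peaked = [0.9] + [0.1 / (W - 1)] * (W - 1)
print(shannon_entropy(peaked))
```

Nothing deep is happening here; it is just the standard calculation, but it makes the link between the information-theoretic and multiplicity-based definitions explicit.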
What I think might be a way to tie them all together is to reinterpret all of those definitions within what is sometimes called the “interaction picture.” In other words, the universe is made up of lots of stuff that interacts with lots of other stuff, and the “stuff” exchanges momentum and energy when interacting. Broadly speaking, entropy is the amount of information we can obtain, but about the interaction and the systems involved. In this definition, some interaction is required (perhaps between the observer and a system) in order to obtain any information. This then fits naturally into most interpretations of quantum mechanics (one might, naïvely, think of Heisenberg’s gamma-ray microscope, though that was long ago shown to be a problematic concept).
So, here’s a challenge for the holidays: give me your views on entropy. What do you see it as being? The prize for my favorite entry will be an autographed photo of a pickle.
OK, so it’s from MSNBC. It’s still funny.
This article says it all. For the umpteenth time.
…at least according to MSNBC.
This looks pretty cool. One of the neatest parts is that they consider the case of a cheating referee! The parallelism here is reminiscent of that in Quantum Tic-Tac-Toe, though there is not necessarily any entanglement as there is in the latter case (superpositions do not necessarily imply entanglement!). Actually, it’s a pretty straightforward application of Grover’s algorithm, but it has a pretty wild conclusion in the optimized case.