Preparing for March Meeting of APS

Well, I’ve been lax in posting as I’ve been busy getting ready for the March Meeting of the APS. But an idea has been percolating in my head in relation to the March Meeting, and it forms the core of a hypothesis I’m presenting there.

The short of it is that I have a theory built around the idea that Bell’s inequalities are merely another statement of the Second Law of Thermodynamics, the latter not actually a fundamental law but rather a strong argument about probabilities (see, for example, Dan Schroeder’s Thermal Physics). In short (and there will be a paper up on the arXiv about this soon), here’s the argument:

1. The entropy of mixing is the entropy created when two systems are combined. It is always zero or positive, since it is essentially just another way of counting the configurations of the combined system, and it is zero when the two systems are the same. As such, it is a measure of the ‘separation’ of the probability distributions of the two systems.
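To make the ‘separation’ reading concrete, here is a toy sketch. The equal 50/50 weighting and the function names are my choices here (information theorists know this particular quantity as the Jensen–Shannon divergence); the point is just that the quantity is always non-negative and vanishes exactly when the two distributions coincide:

```python
import math

def H(p):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mixing_entropy(p, q):
    """Entropy created by mixing two distributions with equal weight:
    the entropy of the 50/50 mixture minus the average entropy of the
    parts.  Always >= 0, and exactly 0 when p == q."""
    mix = [(a + b) / 2 for a, b in zip(p, q)]
    return H(mix) - (H(p) + H(q)) / 2
```

Identical distributions give zero; perfectly distinguishable ones give the maximum of one bit.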

2. The relative (Shannon) entropy and its related forms, the conditional and mutual entropies (see Nielsen and Chuang’s Quantum Computation and Quantum Information), are also measures of the separation of the probability distributions of two systems.

3. The data pipelining inequality implies that, if X -> Y -> Z is a Markov chain, then, based on fundamental properties of Shannon entropies (again, see Nielsen and Chuang), we can write H(X:Y) ≥ H(X:Z) as well as H(Y:Z) ≥ H(X:Z).

4. It follows trivially from 3. (since mutual entropies are non-negative) that

H(X:Y) + H(Y:Z) ≥ H(X:Z)
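The inequality in 4. can be checked numerically. The sketch below builds a toy Markov chain X -> Y -> Z out of a fair bit followed by two binary symmetric channels (the flip probabilities 0.1 and 0.2 are arbitrary choices of mine) and verifies both pipelining inequalities and the combined one:

```python
import math

def H(dist):
    """Shannon entropy (bits) of a distribution stored as {outcome: prob}."""
    return -sum(v * math.log2(v) for v in dist.values() if v > 0)

def mutual_info(joint):
    """I(A:B) = H(A) + H(B) - H(A,B) from a joint distribution {(a, b): prob}."""
    pa, pb = {}, {}
    for (a, b), v in joint.items():
        pa[a] = pa.get(a, 0.0) + v
        pb[b] = pb.get(b, 0.0) + v
    return H(pa) + H(pb) - H(joint)

# Toy Markov chain X -> Y -> Z: X is a fair bit, and each arrow is a
# binary symmetric channel (the flip probabilities p, q are arbitrary).
p, q = 0.1, 0.2
xy, yz, xz = {}, {}, {}
for x in (0, 1):
    for y in (0, 1):
        for z in (0, 1):
            prob = 0.5 * ((1 - p) if y == x else p) * ((1 - q) if z == y else q)
            xy[(x, y)] = xy.get((x, y), 0.0) + prob
            yz[(y, z)] = yz.get((y, z), 0.0) + prob
            xz[(x, z)] = xz.get((x, z), 0.0) + prob

Ixy, Iyz, Ixz = mutual_info(xy), mutual_info(yz), mutual_info(xz)
assert Ixy >= Ixz and Iyz >= Ixz   # the two pipelining inequalities
assert Ixy + Iyz >= Ixz            # H(X:Y) + H(Y:Z) >= H(X:Z)
```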

From the inequality in 4. it is possible to form inequalities of the type derived by Cerf and Adami. In addition, if we speak entirely in terms of relative information, dividing through by the total relative information available (represented by the sum of these terms), we can write an inequality similar to Wigner’s form of Bell’s inequalities (see Wigner’s paper or Sakurai’s Modern Quantum Mechanics):

Pr(X:Y) + Pr(Y:Z) ≥ Pr(X:Z)

Is it possible that Bell’s inequalities, which are, of course, entirely classical (which is why quantum systems violate them), are just another way of writing the second law? The Shannon entropies are purely classical, and the inequality above essentially represents the positivity condition for entropy inherent in the second law.

The thing is that, for quantum systems, we would want to speak in terms of the von Neumann entropy, which is not wholly classical. In that sense it is not clear that we could even write such an inequality in those terms. But this raises the question: shouldn’t the definition of entropy be consistent across regimes?

That’s the question I want feedback on.

At long last….an encyclopedia on a toothpick

After a lengthy break I finally decided to squeeze in a post. I am nearly finished with Haruki Murakami’s Hard-boiled Wonderland and the End of the World, which is a fascinating (and utterly over-the-top) novel that explores odd aspects of the mind. It involves some information theory here and there, and at one point a character describes how one could theoretically fit an encyclopedia on a toothpick. Simply encode each letter in the encyclopedia (in the order they appear) as a string of binary digits. Put all these digits in a row and read the resulting binary number in base-10. Then throw a decimal point on the front and you have a fraction (rational, if the encyclopedia is finite). Then simply put a notch on the toothpick at that fraction of its length. Hmmmmmm………..
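The scheme can be sketched in a few lines. The 8-bit ASCII encoding is my choice (the novel just says binary), and of course no physical toothpick has the precision to hold the notch:

```python
from fractions import Fraction

def text_to_notch(text):
    """Concatenate the 8-bit ASCII codes of each character and read the
    result as a binary fraction 0.b1b2b3... -- an exact rational number
    giving the notch's position along the toothpick."""
    bits = ''.join(format(ord(c), '08b') for c in text)
    return Fraction(int(bits, 2), 2 ** len(bits))

def notch_to_text(frac, n_chars):
    """Invert the encoding, given the number of characters."""
    bits = format(frac.numerator, '0{}b'.format(8 * n_chars))
    return ''.join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))
```

The round trip recovers the text exactly, e.g. `notch_to_text(text_to_notch("Hi"), 2)` gives back `"Hi"`.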

International Holiday

OK, this is cheeky, but today’s my birthday so I am declaring this an international holiday!! (OK it’s a Saturday, but…..) Normally I’m somewhat ambivalent about birthdays, but this year it’s on a weekend so I get to (almost) do whatever I want (when one has young children one finds that one does not always have time to do everything one wants to do, but they’re worth it).
