Well, I’ve been lax in posting as I’ve been busy getting ready for the March Meeting of the APS. But an idea has been percolating in my head in relation to the March Meeting, and it forms the core of a hypothesis I’m presenting there.

The short version is that I have a theory built around the idea that Bell’s inequalities are merely another statement of the Second Law of Thermodynamics, the latter not actually a fundamental law but rather a strong argument about probabilities (see, for example, Dan Schroeder’s Thermal Physics). In short (and there will be a paper up on the arXiv about this soon), here’s the argument:

1. The entropy of mixing is the entropy created when two systems are mixed. It is always zero or positive, since it is essentially another way of counting the configurations of the combined system, and it is zero when the two systems are identical. As such, it is a measure of the ‘separation’ of the probability distributions of the two systems.

2. The relative (Shannon) entropy and its related forms, the conditional and mutual entropies (see Nielsen and Chuang’s Quantum Computation and Quantum Information), are also measures of the separation of the probability distributions of two systems.

3. The data processing and data pipelining inequalities imply that, if X -> Y -> Z is a Markov chain, then, based on fundamental properties of Shannon entropies (again, see Nielsen and Chuang), we can write H(X:Y) ≥ H(X:Z) as well as H(Y:Z) ≥ H(X:Z).

4. It follows trivially from 3., together with the non-negativity of the mutual entropies, that

H(X:Y) + H(Y:Z) ≥ H(X:Z)
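Steps 3 and 4 are easy to check numerically. Here’s a minimal sketch — the binary-symmetric-channel construction and the particular noise levels are my own illustrative choices, not part of the argument — that builds a Markov chain X -> Y -> Z and verifies both inequalities along with the summed form:

```python
import math
from itertools import product

def bsc(p):
    """Binary symmetric channel: flips the input bit with probability p."""
    return {(a, b): (1 - p) if a == b else p
            for a, b in product((0, 1), repeat=2)}

def mutual_info(joint):
    """Mutual entropy H(A:B) in bits, from a joint distribution {(a, b): prob}."""
    pa, pb = {}, {}
    for (a, b), pr in joint.items():
        pa[a] = pa.get(a, 0.0) + pr
        pb[b] = pb.get(b, 0.0) + pr
    return sum(pr * math.log2(pr / (pa[a] * pb[b]))
               for (a, b), pr in joint.items() if pr > 0)

# Markov chain X -> Y -> Z: a fair bit X passed through two noisy channels.
ch1, ch2 = bsc(0.1), bsc(0.2)   # noise levels chosen arbitrarily
pxy, pyz, pxz = {}, {}, {}
for x, y, z in product((0, 1), repeat=3):
    pr = 0.5 * ch1[(x, y)] * ch2[(y, z)]   # p(x) p(y|x) p(z|y)
    pxy[(x, y)] = pxy.get((x, y), 0.0) + pr
    pyz[(y, z)] = pyz.get((y, z), 0.0) + pr
    pxz[(x, z)] = pxz.get((x, z), 0.0) + pr

Hxy, Hyz, Hxz = mutual_info(pxy), mutual_info(pyz), mutual_info(pxz)
print(Hxy >= Hxz, Hyz >= Hxz, Hxy + Hyz >= Hxz)  # True True True
```

Information about X can only degrade as it passes through Y on the way to Z, which is exactly the monotonicity the second-law reading relies on.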

From this inequality it is possible to form inequalities of the type derived by Cerf and Adami. In addition, if we speak entirely in terms of relative information, dividing through by the total relative information available — the sum of all three terms, which is positive and so preserves the inequality — we can write an inequality similar to Wigner’s form of Bell’s inequalities (see Wigner’s paper or Sakurai’s Modern Quantum Mechanics):

Pr(X:Y) + Pr(Y:Z) ≥ Pr(X:Z)

Is it possible that Bell’s inequalities, which are entirely classical (which is why they can be violated), are just another way of writing the second law? The Shannon entropies are purely classical, and the inequality above essentially represents the positivity condition on entropy that is inherent in the second law.
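The quantum violation itself is easy to see numerically. In Wigner’s form as presented in Sakurai, the singlet-state probability that both detectors read ‘+’ along axes separated by an angle θ is (1/2)sin²(θ/2); the axis angles below are a standard illustrative choice, not something fixed by the argument above:

```python
import math

def p_both_plus(theta_deg):
    """Singlet-state probability that both spins give '+' along axes
    separated by theta degrees: (1/2) sin^2(theta/2)."""
    return 0.5 * math.sin(math.radians(theta_deg) / 2) ** 2

# Wigner's inequality P(a;b) <= P(a;c) + P(c;b), with axes at 0, 60, 120 deg:
lhs = p_both_plus(120)                   # P(a;b): a and b are 120 deg apart
rhs = p_both_plus(60) + p_both_plus(60)  # P(a;c) + P(c;b): 60 deg each
print(lhs, rhs, lhs <= rhs)  # lhs = 0.375 > rhs = 0.25: inequality violated
```

Any classical (local hidden variable) assignment of spin values satisfies the inequality, but the quantum correlations exceed it for these angles.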

The thing is that, for quantum systems, we would want to speak in terms of the von Neumann entropy, which is not wholly classical. In that sense it is not clear that we could even write such an inequality in these terms. But this raises the question: shouldn’t the definition of entropy be consistent across regimes?
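One concrete way to see the tension: for an entangled state, the quantum analogue of the conditional entropy can go negative, something no classical Shannon entropy can do (this is precisely the sort of quantity Cerf and Adami work with). A minimal sketch with NumPy, using a Bell state as the standard example:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]   # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

# Bell state |phi+> = (|00> + |11>)/sqrt(2)
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_ab = np.outer(phi, phi)

# Reduced state of subsystem A: partial trace over B (indices a, b, a', b')
rho_a = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=1, axis2=3)

S_ab = von_neumann_entropy(rho_ab)  # 0: the joint state is pure
S_a = von_neumann_entropy(rho_a)    # 1 bit: the reduced state is maximally mixed
# By symmetry S(A) = S(B) here, so S(A|B) = S(AB) - S(B) = S_ab - S_a = -1.
print(S_ab, S_a, S_ab - S_a)
```

Classically, H(X|Y) ≥ 0 always, so a negative conditional entropy marks exactly where the Shannon-entropy argument stops applying.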

That’s the question I want feedback on.
