## Noncommutative Geometry

Before jumping into the topic for today, let me say a few words about Brexit: the UK put on a show of monstrous selfishness and hypocrisy: after colonizing half the globe, now they complain about immigrants?

Back to physics (and math). Last time I stated that geometry requires a generalization, so what does this all mean? There are many ways one can approach this, but let's do it in historical fashion and start with the duality:

Geometry - Algebra

It is informative to remember how the ancient Greeks did geometry. For them everything (including the proofs) was a geometric construction with straightedge and compass, and they had no concept of coordinates.

It was not until 1637 that geometry and algebra were married by Descartes with what we now call a Cartesian coordinate system. Subsequently mathematicians started realizing that geometry and algebra are nothing but distinct languages describing the very same thing. The first geometry theorem unknown to the ancient Greeks was discovered in 1899, and its proof was carried out by purely algebraic arguments.

The power of algebra is higher than that of geometry because abstractions are easier to formalize in algebra. In algebra one readily encounters noncommutativity; one example is operator non-commutativity in quantum mechanics. But if algebra is dual to geometry, what kind of geometric space would correspond to a non-commutative algebra? What does it mean that "the algebra of coordinates is non-commutative"?

The simplest example is that of a torus. Recall the old-fashioned arcade games where your character exits through the right side of the screen and re-enters through the left side? Similarly, if you move past the top edge you re-emerge at the bottom. Topologically this is a torus. Now suppose you move in a straight line in such a way that the ratio of your horizontal and vertical speeds is an irrational number. Slice the torus into the trajectory lines respecting this ratio. What you get is a pathological foliation: every "measurable function" on the space of leaves is almost everywhere constant, and there are no non-constant continuous functions.
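The rational/irrational dichotomy is easy to see numerically. Below is a minimal sketch in Python (assuming NumPy; the helper names are mine): the successive heights at which the trajectory crosses a fixed vertical circle on the torus form a rotation of that circle by the slope, and the orbit of this rotation is dense exactly when the slope is irrational.

```python
import numpy as np

def section_orbit(slope, n):
    """Successive heights at which the trajectory crosses the circle x = 0:
    a rotation of the circle by `slope` (mod 1)."""
    return np.mod(slope * np.arange(n), 1.0)

def visited_fraction(slope, n, bins=100):
    """Fraction of `bins` equal arcs of the circle hit by the first n crossings."""
    counts, _ = np.histogram(section_orbit(slope, n), bins=bins, range=(0.0, 1.0))
    return np.mean(counts > 0)

golden = (1 + np.sqrt(5)) / 2
frac_rational = visited_fraction(0.25, 5000)      # closed orbit: only 4 arcs ever visited
frac_irrational = visited_fraction(golden, 5000)  # dense orbit: eventually every arc is hit
```

With a rational slope the crossings hit only a handful of arcs; with the golden ratio they fill the whole circle, which is the discrete shadow of the pathological foliation above.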

Other pathological examples are: the space of Penrose tilings, deformations of Poisson manifolds, quantum groups, moduli spaces, etc.

From the mathematical side, one can explain away all those pathological cases one by one, but this is missing the forest for the trees. The duality from above now becomes:

Quotient spaces - Noncommutative Algebra

where we replace the commutative algebra of functions constant along the classes of an equivalence relation by the noncommutative convolution algebra of the equivalence relation.

Basically it all boils down to a generalization of measure theory. It is well known that the proper way to generalize measure theory is via von Neumann algebras, and this is how quantum mechanics enters the picture (although historically non-commutative geometry arose from quantum mechanics and the work to classify von Neumann algebras).

Next time we are going to dive deeper into non-commutative geometry and we will encounter the Dirac operator.

## Norm and correlations

Continuing the discussion from last time, today I want to talk about the norm of a linear operator and its implications for the maximum correlations which can be achieved in nature: Tsirelson's bound. The very same norm definition will later play a key role in what unexpectedly turns out to be a "geometrization" of the Standard Model coupled with (unquantized) gravity.

In a Hilbert space, the definition of the norm of a bounded linear operator is:

$$||A|| = \sup_{u \neq 0} \frac{||Au||}{||u||}$$

The most important properties of the norm for bounded operators are the triangle inequality:

$$||A+B|| \leq ||A|| + ||B||$$

and a submultiplicative inequality which guarantees the continuity of multiplication:

$$||AB|| \leq ||A|| ||B||$$

(can we call this a triangle inequality for multiplication?)
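Both inequalities are easy to spot-check numerically. Here is a minimal sketch in Python (assuming NumPy; the helper name `op_norm` is mine), using the fact that for matrices the operator norm defined above equals the largest singular value:

```python
import numpy as np

rng = np.random.default_rng(0)

def op_norm(A):
    """Operator norm ||A|| = sup ||Au|| / ||u||; for a matrix this is the
    largest singular value, which np.linalg.norm(A, 2) computes."""
    return np.linalg.norm(A, 2)

# Spot-check the triangle inequality and submultiplicativity on random
# complex matrices (the 1e-12 slack absorbs floating-point rounding).
for _ in range(100):
    A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    B = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    assert op_norm(A + B) <= op_norm(A) + op_norm(B) + 1e-12
    assert op_norm(A @ B) <= op_norm(A) * op_norm(B) + 1e-12
```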

On the basis of the triangle inequality, one may be tempted to associate the notion of physical distance with the norm of an operator in a Hilbert space, but this is a dead end. The triangle inequality for operators is essential for quantum mechanics because it ensures the usual notions of convergence in functional analysis (most of functional analysis follows from it). The name of this blog is elliptic composability, and the "elliptic" part follows from the triangle inequality above. If one imagines a quantum mechanics where the triangle inequality is reversed, one arrives at the unphysical hyperbolic quantum mechanics based on split-complex numbers, which violates positivity and in turn prevents the usual definition of probability as a positive quantity.

There turns out, however, to be a deep and completely counter-intuitive relationship between the "sup" in the norm definition and the notion of physical distance, but more on this in subsequent posts; I don't want to spoil the surprise, only to whet the (mathematical) appetite a bit.

Now back to correlations. Suppose we have four self-adjoint operators $$\sigma_\alpha, \sigma_\beta, \sigma_\gamma, \sigma_\delta$$ such that:

$${\sigma_\alpha}^2 = {\sigma_\beta}^2 = {\sigma_\gamma}^2 = {\sigma_\delta}^2 = 1$$
and
$$[\sigma_\alpha, \sigma_\beta] = [\sigma_\beta, \sigma_\gamma] = [\sigma_\gamma, \sigma_\delta] = [\sigma_\delta, \sigma_\alpha] = 0$$

If we define an operator $$C$$ as follows:

$$C= \sigma_\alpha \sigma_\beta + \sigma_\beta \sigma_\gamma + \sigma_\gamma \sigma_\delta - \sigma_\delta \sigma_\alpha$$

Then it is not hard to show that:

$$C^2 = 4 + [\sigma_\alpha, \sigma_\gamma][\sigma_\beta, \sigma_\delta]$$

From the triangle inequality and submultiplicativity we have in general that:

$$||[A, B]|| = ||AB - BA|| = ||AB + (-B)A|| \leq ||AB|| + ||-BA||$$
$$= ||AB|| + ||BA|| \leq ||A||||B|| + ||A||||B|| = 2||A||||B||$$

and so

$$|| [\sigma_\alpha, \sigma_\gamma]|| \leq 2 ||\sigma_\alpha|| ||\sigma_\gamma|| = 2$$
$$|| [\sigma_\beta, \sigma_\delta]|| \leq 2 ||\sigma_\beta|| ||\sigma_\delta|| = 2$$

Therefore:

$$||C^2|| \leq 4 + 4 = 8$$

and since $$C$$ is self-adjoint, $$||C||^2 = ||C^2||$$, therefore:

$$||C|| \leq 2\sqrt{2}$$

And this is Tsirelson's bound, because $$C$$ appears on the left-hand side of the CHSH inequality.
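As a sanity check, the bound is attained in a concrete realization (a sketch assuming NumPy; this particular choice of Pauli operators is mine and is not forced by the derivation): let $$\sigma_\alpha, \sigma_\gamma$$ act on one qubit and $$\sigma_\beta, \sigma_\delta$$ on a second, so that all four required commutators vanish.

```python
import numpy as np

# Pauli matrices and the one-qubit identity
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

# alpha, gamma act on the first qubit; beta, delta on the second,
# so each of the four commutators in the derivation vanishes.
s_alpha = np.kron(sx, I2)
s_gamma = np.kron(sz, I2)
s_beta  = np.kron(I2, (sx + sz) / np.sqrt(2))
s_delta = np.kron(I2, (sx - sz) / np.sqrt(2))

C = (s_alpha @ s_beta + s_beta @ s_gamma
     + s_gamma @ s_delta - s_delta @ s_alpha)

norm_C = np.linalg.norm(C, 2)  # operator norm = largest singular value
```

The computed norm comes out at $$2\sqrt{2}$$, so the abstract bound above is tight.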

Now one can read the textbook derivation of Tsirelson's bound in many places, but the key point is that quantum correlations have their origin in the notion of operator norms in a Hilbert space. Nowhere in the derivation have we used the notion of distance or causality. Quantum correlations are a mathematical consequence of the quantum formalism, which is in turn a consequence of considerations of composition and information. Quantum mechanics is nothing but composition and information, and correlations (both quantum and classical) are nothing but considerations of composition and information as well.

The usual way we understand classical correlations, as generated by a common cause, is a parochial view due to our classical intuition. Yes, a common cause can generate correlations, but correlation does not imply causation.

So what does all of this have to do with the notion of distance and that of space-time? To uncover the link we first need to generalize the notion of a space in geometry. Crazy talk? Not when it is based on the von Neumann algebra research which led to a Fields Medal. Please stay tuned...

## Correlations vs. locality: Can you hear the shape of a drum?

Is Nature local or non-local? These are the battle lines between the epistemic and ontic camps. But can we approach the problem from a different angle? I don't remember the quote exactly, but Bell once stated something along the following lines: quantum correlations cry out for an explanation. Or do they? I will attempt to make the case for the contrary.

If you think quantum correlations (which go above the Bell limit) are in need of an explanation, then very likely you are in the ontic, beable, non-local camp. Personally I am not in this camp; in fact I am not in anyone's camp. What I am trying to do is reconstruct quantum mechanics from physical principles and then mathematically arrive at the correct interpretation of quantum mechanics.

So what do I know for now? Quantum mechanics is locality-independent. This means that considerations of locality have no role whatsoever in deriving quantum mechanics. Quantum correlations which go above Bell's limit are a mathematical consequence of the quantum formalism. Do I find correlations above the Bell limit strange? Indeed I do, because as a living organism I am the result of a long natural selection process which favors classical intuition as a necessary tool for survival. However, as a physicist, it is not the correlations alone which are troublesome to me, but correlations over spatially separated experiments. And if quantum correlations are natural mathematical consequences of the quantum formalism, what is in deep need of an explanation is the very idea of distance. This is a different paradigm from the one put forward by Bell.

We tend to take the idea of space or space-time, or locality, or neighborhood for granted because this is how nature is and physics is an experimental science. But if everything is quantum mechanical at the core, and if locality plays no role in deriving quantum mechanics, where does the metric tensor come from? (I will attempt to show that this is the deep mystery, not the correlations.) The funny thing is that the answer is known, and it was arrived at by a completely unexpected route starting with a strange question: can you hear the shape of a drum? Moreover, although there are exceptions, the answer is well known neither to the quantum foundations community nor to the string theory/high energy physics community; paradoxically, it is well known to mathematicians, who uncovered an extremely rich mathematical domain: noncommutative geometry. I will start exploring this area in this post and continue the topic in subsequent ones.

So what is the shape-of-a-drum question about? When you go to a symphonic concert, you can clearly tell pianos from trumpets, trumpets from drums, and so on. Why? Because they sound different even when they play the very same note.

But now let's make the problem harder and pick two instruments of the same type. The shape of the instrument determines the spectral characteristics of the sound. Can we solve the inverse problem? Do the eigenfrequencies uniquely determine the shape of the instrument? The answer is negative, as counterexamples show. However, we are onto something interesting here. Eigenvalues and eigenvectors occur naturally in quantum mechanics. Also, although the answer is negative in general, we can still tell one instrument from another, and therefore we must be missing only a little bit of information to solve the inverse problem. And if we are able to solve the inverse problem, we have succeeded in recovering the metric tensor information in a very different language, and moreover this language is common to quantum mechanics as well.
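In one dimension the inverse problem is actually solvable, which makes for a small worked illustration (a Python sketch assuming NumPy; the length and grid size are arbitrary choices of mine): for a vibrating string of length $$L$$ with fixed ends the spectrum is $$\lambda_n = (n\pi/L)^2$$, so the lowest eigenvalue alone "hears" the length.

```python
import numpy as np

def dirichlet_eigenvalues(L, n_points=500, k=5):
    """Lowest k eigenvalues of -d^2/dx^2 on [0, L] with fixed (Dirichlet) ends,
    via a standard second-order finite-difference discretization."""
    h = L / (n_points + 1)
    laplacian = (np.diag(2.0 * np.ones(n_points))
                 - np.diag(np.ones(n_points - 1), 1)
                 - np.diag(np.ones(n_points - 1), -1)) / h**2
    return np.sort(np.linalg.eigvalsh(laplacian))[:k]

# Invert lambda_1 = (pi/L)^2 to recover the length from the spectrum alone.
true_L = 2.7
lam1 = dirichlet_eigenvalues(true_L)[0]
recovered_L = np.pi / np.sqrt(lam1)
```

For drums (two dimensions) this inversion famously fails in general, which is exactly what makes the question interesting.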

Please stay tuned for the continuation next week.

## Interaction-free measurement

In earlier posts I have described my proposal for the solution of the measurement problem, in which a pair of quantum systems (system, observer) selects a distinguished state by breaking an equivalence relationship. The physical reason for the equivalence relationship is the fundamental indeterminacy of quantum mechanics due to operator non-commutativity. The proposed mechanism preserves unitarity and does not require an interaction Hamiltonian. Of course, an interaction Hamiltonian can break the equivalence relationship as well, but today I want to focus on how interaction-free measurements can occur in nature.

There are several proposals for interaction-free measurements, but the most dramatic and well-known one is due to Avshalom Elitzur and Lev Vaidman, in the form of the Elitzur-Vaidman bomb tester. On a personal note, Avshalom is a very pleasant, down-to-earth, non-conformist person who speaks truth to power, is a genuine truth seeker, and is a champion of the lowly persons toiling to push the boundary of knowledge. Last year I was having lunch with him at a conference when he surprised me by offering to swap our speaking time slots. This was like offering to swap a shiny new Mercedes-Benz for a beat-up Toyota. I was too stunned to accept, but I was deeply touched, and I owe him a debt of gratitude for that random act of kindness.

Now back to the bomb tester. The setup is as follows: a bomb factory produces extremely sensitive bombs which explode when they interact with a single photon. However, the factory is not perfect; sometimes it manufactures duds, and the quality assurance department of the factory has to eliminate those defective bombs. The question is: how? In a classical physics universe this would be an impossible task: the very act of testing would explode the good bombs and leave the bad ones intact. Here quantum mechanics comes to the rescue. Suppose we have an interferometer like the one in the picture below, tuned in such a way that every incoming photon triggers a detection at D1 while D2 never detects anything.

I am too lazy to write out the required LaTeX formulas, so I picked a picture which shows the actual quantum states along the arms of the interferometer. Now if a functional bomb is inserted on the top arm, it blocks the top photon path. If the photon takes the upper path, it explodes the bomb. If however it takes the lower path, the interference at the second beam splitter is prevented and the photon can be detected at either D1 or D2!

Detection at D2 signals that a working bomb was inserted, and moreover we found this out without exploding it!

But wait a minute: what happens if a dud is inserted into the interferometer arm? The dud does not interact with the photon, and the detection always occurs at D1.
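The amplitude bookkeeping for the two cases fits in a few lines (a sketch assuming NumPy; I use a real, Hadamard-type beam-splitter convention, which differs from the usual phase conventions but yields the same probabilities):

```python
import numpy as np

# 50/50 beam splitter acting on (upper, lower) path amplitudes.
BS = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

photon_in = np.array([1, 0], dtype=complex)  # photon enters one input port

# Dud: no blocking, the two beam splitters interfere and the photon
# always ends up at D1.
out_dud = BS @ (BS @ photon_in)
p_d1_dud, p_d2_dud = np.abs(out_dud) ** 2

# Live bomb: it acts as a which-path measurement on the upper arm.
mid = BS @ photon_in
p_boom = np.abs(mid[0]) ** 2                    # photon absorbed: explosion
blocked = np.array([0, mid[1]], dtype=complex)  # only lower amplitude survives
out_live = BS @ blocked
p_d1_live, p_d2_live = np.abs(out_live) ** 2
```

A dud yields detection at D1 with certainty; a live bomb explodes with probability 1/2, and otherwise D1 and D2 each fire with probability 1/4, so a D2 click certifies a live bomb without detonating it.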

Now the next question for the quality assurance department is whether there is a way to increase the detection rate of the good bombs without exploding them. This is again possible using quantum mechanics, and the idea comes from the quantum Zeno effect, or in more popular terms: a watched pot never boils. In the quantum world this is an actual physical effect, not a psychological phenomenon. In fact, using the quantum Zeno effect the efficiency of the bomb detector can be increased arbitrarily close to 100%.
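The standard back-of-the-envelope Zeno analysis can be sketched as follows (assuming an idealized lossless setup; the per-cycle rotation angle of π/2N is the usual textbook choice): the photon polarization is rotated by π/2N in each of N cycles, and a live bomb acts as a measurement that resets the polarization, so the photon survives each cycle with probability cos²(π/2N).

```python
import numpy as np

def zeno_success_probability(n_cycles):
    """Probability of certifying a live bomb WITHOUT an explosion when the
    polarization is rotated by pi/(2*n_cycles) per cycle and the bomb measures
    (resets) it each cycle: survival probability cos^2 per cycle, n cycles."""
    theta = np.pi / (2 * n_cycles)
    return np.cos(theta) ** (2 * n_cycles)

# Efficiency grows toward 1 as the interrogation is split into more cycles.
probs = {n: zeno_success_probability(n) for n in (10, 100, 1000)}
```

The success probability cos^{2N}(π/2N) approaches 1 as N grows: roughly 0.78 for N = 10 and above 0.99 for N = 1000.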

Now if you think this is all nonsensical theoretical musing, nature has confirmed the reality of the argument; for details see this experimental paper. So what do we learn from the existence of interaction-free measurements? First, nature is quantum mechanical: there is simply no way classical physics can achieve interaction-free measurements. Second, quantum measurement is not a naive process which can be explained away by finding some interaction Hamiltonian. True, there is an interaction taking place here, but it is between the photon and the detector, not between the photon and the bomb. Third, the role of the observer is essential: no observer means no collapse.

So how does my proposed solution to the measurement problem using the Grothendieck group construction fare against interaction-free measurement? It passes with flying colors. The Grothendieck group construction demands the construction of a Cartesian pair, and physically the pair can be identified with the quantum system and the observer. Also, the collapse is a change in representation, not a unitary evolution. Lastly, the unitary evolution is completely preserved. The only strange part is the existence of many Hilbert spaces, each corresponding to a potential outcome. Still, this proposal is not the many worlds interpretation. Since the Hilbert spaces are only mathematical constructs devoid of physical reality ("Quantum phenomena do not occur in a Hilbert space. They occur in a laboratory." - Asher Peres), we can embed all those Hilbert spaces into a single one, and the quantum interpretation which arises out of the Grothendieck group construction is that of Copenhagen.