Sunday, March 26, 2017

Time as a continuous functor

To recall from prior posts, a functor maps objects to objects and arrows to arrows between two categories; in other words, it is structure preserving. In the case of a monoidal category, suppose there is an arrow \(*: C\times C \rightarrow C\). Then a functor T makes the diagram below commute:

This is all fancy abstract math, but it has a simple physical interpretation when T corresponds to time evolution: the laws of physics do not change in time. Moreover, it can be shown, with a bit of effort and some knowledge of C* algebras, that time as a functor = unitarity.

But what can we derive from the commutative diagram above? With the additional help of two more very simple and natural ingredients, we will be able to reconstruct the complete formalism of quantum mechanics! Today I will introduce the first one: time is a continuous parameter. Just as in group theory adding continuity results in the theory of Lie groups, we will consider continuous functors and investigate what happens in the neighborhood of the identity element.

In the limit of time evolution going to zero, T becomes the identity. For an infinitesimal time evolution we can then write:

\(T = I + \epsilon D\)

We plug this back into the diagram commutativity condition \(T(A)*T(B) = T(A*B)\) and, to first order, we obtain the product rule (Leibniz rule) of differentiation:

\(D(A*B) = D(A)*B + A*D(B)\)
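Spelling out the first-order bookkeeping, using the bilinearity of \(*\) and dropping terms of order \(\epsilon^2\):

```latex
T(A)*T(B) = (A + \epsilon D(A)) * (B + \epsilon D(B))
          = A*B + \epsilon\,\bigl(D(A)*B + A*D(B)\bigr) + O(\epsilon^2)

T(A*B)    = A*B + \epsilon\, D(A*B)
```

Matching the terms linear in \(\epsilon\) gives the product rule above.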

Time evolution is not of a single kind, and \(D\) is not unique (think of the various possible Hamiltonians). There is a natural transformation between different time evolution functors, and we can express D as an operation of the form \(D_A = A\,\alpha\,\cdot\), where \((\cdot~\alpha~\cdot)\) is a product:

\(\alpha : C\times C \rightarrow C\)

Then we obtain the Leibniz identity:

\(A\alpha (B * C) = (A\alpha B) * C + B * (A \alpha C)\)
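We can sanity-check the Leibniz identity in a concrete model. This is only an illustrative sketch, under the assumption of a finite-dimensional matrix realization in which \(*\) is the matrix product and \(\alpha\) is the commutator (which is indeed a derivation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumption for illustration: * is the matrix product and
# A alpha B = [A, B] = AB - BA, the commutator.
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))

def alpha(X, Y):
    # commutator: one concrete realization of the alpha product
    return X @ Y - Y @ X

lhs = alpha(A, B @ C)                    # A alpha (B * C)
rhs = alpha(A, B) @ C + B @ alpha(A, C)  # (A alpha B) * C + B * (A alpha C)
assert np.allclose(lhs, rhs)
```

Expanding \([A, BC] = ABC - BCA\) by hand shows why the cross terms cancel, mirroring the first-order computation above.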

This is extremely powerful, as it is unitarity in disguise.  Next time we'll use the tensor product and the second ingredient to obtain many more mathematical consequences. Please stay tuned.

Sunday, March 19, 2017

Monoidal categories and the tensor product

Last time we discussed the category theory product, which forms a new category from two categories. Suppose now that we start with one category \(C\) and form the product with itself, \(C\times C\). It is natural to ask whether there is a functor from \(C\times C\) to \(C\). If such a functor exists and moreover respects associativity and unit elements, then the category \(C\) is called a monoidal category. By abuse of notation, the functor above is called the tensor product, but this is not the usual tensor product of vector spaces. The tensor product of vector spaces is only one concrete example of a monoidal product. To get to the ordinary tensor product we need to inject physics into the problem.

The category \(C\) we are interested in is that of physical systems, where the objects are physical systems and the arrows are compositions of physical systems. The key physical concepts needed are those of time and of dynamical degrees of freedom inside the Hamiltonian formalism.

Time plays a distinguished role in quantum mechanics, both in terms of the formalism (remember that there is no time operator) and in how quantum mechanics can be reconstructed.

The phase space of the Hamiltonian formalism is a Poisson manifold, which is not necessarily a vector space; but because the Hilbert space \(L^2 (R^3\times R^3)\) is isomorphic to \(L^2 (R^3 ) \otimes L^2 (R^3 )\), let us discuss monoidal categories for vector spaces obeying an equivalence relationship. Hilbert spaces form a category of their own, and there is a functor mapping physical systems into Hilbert spaces. This is usually presented as the first postulate of quantum mechanics: each physical system is associated with a complex Hilbert space H.

For complete generality of the definition of the tensor product, we consider two distinct vector spaces V and W, for which we first form the category theory product (in this case the Cartesian product) and then make the following identifications:
  • \((v_1, w)+(v_2, w) = (v_1 + v_2, w)\)
  • \((v, w_1)+(v, w_2) = (v, w_1 + w_2)\)
  • \(c(v,w) = (cv, w) = (v, cw)\)
For a physical justification, think of V and W as one-dimensional vector spaces corresponding to distinct dynamical degrees of freedom. Linearity is a property of vector spaces, and we expect this property to be preserved if vector spaces are to describe nature. The bilinearity in the equivalence relationship above arises because the degrees of freedom are independent.

Now a Cartesian product of vector spaces respecting the above relationships is a new mathematical object: a tensor product.
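As a quick sanity check, here is a sketch (assuming finite-dimensional real vector spaces, with the outer product as a concrete model of the tensor product) verifying the three identifications; the specific vectors are arbitrary:

```python
import numpy as np

# Concrete model: (v, w) is sent to the matrix v w^T in V (x) W.
v1, v2 = np.array([1.0, 2.0]), np.array([3.0, -1.0])
w1, w2 = np.array([0.5, 4.0, 1.0]), np.array([2.0, 0.0, -3.0])
c = 7.0

tensor = np.outer  # (v, w) -> v (x) w

# (v1, w) + (v2, w) = (v1 + v2, w)
assert np.allclose(tensor(v1, w1) + tensor(v2, w1), tensor(v1 + v2, w1))
# (v, w1) + (v, w2) = (v, w1 + w2)
assert np.allclose(tensor(v1, w1) + tensor(v1, w2), tensor(v1, w1 + w2))
# c(v, w) = (cv, w) = (v, cw)
assert np.allclose(c * tensor(v1, w1), tensor(c * v1, w1))
assert np.allclose(c * tensor(v1, w1), tensor(v1, c * w1))
```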

The tensor product is unique up to isomorphism and respects the following universal property:

There is a bilinear map \(\phi : V\times W \rightarrow V\otimes W\) such that given any other vector space Z and a bilinear map \(h: V\times W \rightarrow Z\) there is a unique linear map \(h^{'}: V\otimes W \rightarrow Z\) such that the diagram below commutes.
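Here is a small numeric sketch of the universal property, under the assumption that V and W are finite-dimensional real spaces, \(\phi\) is the outer product, and Z = R; the matrix P encoding the bilinear map h is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(1)
dim_v, dim_w = 2, 3

# An arbitrary bilinear map h : V x W -> Z (here Z = R), encoded by a matrix P.
P = rng.standard_normal((dim_v, dim_w))
def h(v, w):
    return v @ P @ w

# phi : V x W -> V (x) W is the canonical bilinear map (outer product).
def phi(v, w):
    return np.outer(v, w)

# The induced linear map h' on V (x) W, fixed by h'(e_i (x) f_j) = h(e_i, f_j).
def h_prime(M):
    return float(np.sum(M * P))  # linear in M

v, w = rng.standard_normal(dim_v), rng.standard_normal(dim_w)
# The triangle commutes: h = h' after phi
assert np.isclose(h(v, w), h_prime(phi(v, w)))
```

The point is that h' has no freedom: its values on the basis elements \(e_i \otimes f_j\) are forced by h, which is exactly the uniqueness claim.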

This universal property is very strong, and several mathematical facts follow from it: the tensor product is unique up to isomorphism (instead of Z, consider another tensor product \(V\otimes^{'}W\)), the tensor product is associative, and there is a natural isomorphism between \(V\otimes W\) and \(W\otimes V\), making the tensor product an example of a symmetric monoidal category, just like the category of physical systems under composition.

This may look like a trivial observation, but it is extremely powerful, and it is the starting point of the quantum mechanics reconstruction. On one hand we have composition of physical systems and theories of nature describing physical systems. On the other hand we have dynamical degrees of freedom and the rules of quantum mechanics. The two are actually identical, and each one can be derived from the other. To do this we need one additional ingredient: time viewed as a functor. Please stay tuned.

Monday, March 13, 2017

Category Theory Product

Before we discuss this week's topic, I want to make two remarks about the content of the prior posts. First, why do we need natural transformations in algebraic topology? Associating groups to topological spaces (which, incidentally, describe the hole structure of the space) is done by the use of functors. Different (co)homology theories are basically different functors, and proving their equivalence is the same as proving the existence of a natural transformation. Second, the logic used in category theory is intuitionistic logic, where truth is proved constructively. Since this is mapped into computer science by the Curry-Howard isomorphism, the fact that some statements have no constructive proof is equivalent to a computation running forever. In computation theory one encounters the halting problem. If the halting problem were decidable, then category theory would be mapped to ordinary logic instead of intuitionistic logic.

Now back to the topic of the day. We are still in the pure math domain, looking at mathematical objects from 10,000 feet, disregarding their nature and observing only their existence and their relationships (objects and arrows). The first question one asks is: how do we construct new categories from existing ones? One way is to simply reverse the direction of all arrows; the resulting category is unsurprisingly called the opposite category (or the dual). Another way is to combine two categories into a new one. Enter the concept of a product of two categories: \(\mathbf{C}\times \mathbf{D}\). In set theory this would correspond to the Cartesian product of two sets. However, we need to give a definition which is independent of the nature of the elements. Moreover, we want to give it in a way which guarantees uniqueness up to isomorphism.

The basic idea is that of projections from the elements of \(\mathbf{C}\times \mathbf{D}\) back to the elements of \(\mathbf{C}\) and \(\mathbf{D}\). So how do we know that those projections and the product are unique up to isomorphism? Suppose that there is another category \(\mathbf{Y}\) with maps \(f_C\) and \(f_D\). Then there is a unique map \(f\) such that the diagram below commutes.

This diagram has to commute for all categories \(\mathbf{Y}\) and their maps \(f_C\) and \(f_D\). From this definition, can you prove uniqueness of the product up to isomorphism? It is a simple matter of "diagram reasoning". Just pretend that Y is now the "true incarnation" of the product. You need to find a morphism f from Y to CxD and a morphism g from CxD to Y such that \(f\circ g =1_{C\times D}\) and \(g\circ f = 1_Y\). See? Category theory is really easy and not harder than linear algebra.
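In the category of sets the diagram chase becomes very concrete. The sets and maps below are made-up illustrations; the point is that commutativity forces \(f(y) = (f_C(y), f_D(y))\), which is exactly why \(f\) is unique:

```python
# Given any Y with maps f_C : Y -> C and f_D : Y -> D, the mediating
# map f : Y -> C x D is forced to be f(y) = (f_C(y), f_D(y)).
C = ["a", "b"]
D = [1, 2, 3]
Y = range(6)

f_C = lambda y: C[y % 2]   # an arbitrary example map into C
f_D = lambda y: D[y % 3]   # an arbitrary example map into D

pi_C = lambda pair: pair[0]  # projection to C
pi_D = lambda pair: pair[1]  # projection to D

f = lambda y: (f_C(y), f_D(y))  # the unique mediating map

# The diagram commutes: pi_C o f = f_C and pi_D o f = f_D
assert all(pi_C(f(y)) == f_C(y) and pi_D(f(y)) == f_D(y) for y in Y)
```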

Now what happens if we flip all the arrows in the diagram above? We obtain the coproduct category \(\mathbf{C}\oplus \mathbf{D}\), and the projection maps become injection maps.

OK, time for concrete examples:

  • sets: product = Cartesian product, coproduct = disjoint union
  • partial order sets: product = greatest lower bounds (meets), coproduct = least upper bounds (joins)
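The poset example can be made concrete in the divisibility order on the positive integers, where an arrow \(x \rightarrow y\) means x divides y; here is a small brute-force sketch (the particular numbers are arbitrary):

```python
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

a, b = 12, 18
meet, join = gcd(a, b), lcm(a, b)   # product = meet, coproduct = join

divides = lambda x, y: y % x == 0   # the arrow x -> y of the poset

# The product (meet) has arrows to both factors ...
assert divides(meet, a) and divides(meet, b)
# ... and any other lower bound z factors through it:
assert all(divides(z, meet) for z in range(1, 100)
           if divides(z, a) and divides(z, b))

# Dually for the coproduct (join):
assert divides(a, join) and divides(b, join)
assert all(divides(join, z) for z in range(1, 1000)
           if divides(a, z) and divides(b, z))
```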
So where are we now? The concept of the product is very simple, but we need it as a stepping stone to the concepts of the tensor product and of a (symmetric) monoidal category. Why? Because physical systems form a symmetric monoidal category. Using categorical arguments, we can derive the complete mathematical properties of any theory of nature describing a symmetric monoidal category. And the answer will turn out to be: quantum mechanics. Please stay tuned.

Saturday, March 4, 2017

The Curry–Howard isomorphism

Category theory may seem very abstract and intimidating, but in fact it is extremely easy to understand. In category theory we look at concrete objects from far away, without any regard for their internal structure. This is similar to Bohr's position on physics: physics is about what we can say about nature, not about deciding what nature is. Surprisingly, a lot of information about the objects in category theory is derivable from their behavior, and this is where I am ultimately heading with this series on category theory.

Last time I mentioned the origin of category theory as the formalism to clarify when two homology theories are equivalent. But category theory can be approached from two other directions as well, and those alternative viewpoints help provide the intuition needed to navigate its abstractions. One thread of discussion starts with the idea of computability and the work of Alonzo Church and Alan Turing. Turing was Church's student, and each started an essential line of research: lambda calculus and universal Turing machines. Those later grew, on one hand, into functional languages like JavaScript and, on the other hand, into object-oriented languages like C++. What one can do with lambda calculus can be achieved with universal Turing machines, and the other way around. The essential idea of computer programming is to build complex structures out of simpler building blocks. Object-oriented programming starts from the idea of packaging together actions and states. An object is a "black box" containing actions (functions performing computations) and information (the internal state of the object). Functional programming, on the other hand, lacks the concept of an internal state: you deal only with functions which take an input, crunch the numbers, and produce an output. The typical example is FORTRAN: FORmula TRANslation (from a higher-level, human-understandable syntax into the zeroes and ones understandable by a machine).

The second direction from which one can start category theory is intuitionistic logic and the foundations of set theory. The problem with naive set theory is that one can create paradoxes like Russell's paradox: the set of all sets which are not members of themselves. The solution Russell proposed was type theory. Types introduce structure into set theory, preventing self-referential constructions. In computer programming, types are semantic rules which tell us how to interpret various sequences of zeroes and ones in computer memory as integers, boolean variables, etc.

In intuitionistic logic, statements are not proved true by simply disproving their falsehood; they are true only when an explicit construction is provided. Truth must be computed, and the parallel with computer programming is obvious. There is a name for this relationship: the Curry-Howard isomorphism. The mathematical formalism needed to rigorously spell out this correspondence is category theory. At a high level:
  • proofs are programs
  • propositions are types
More importantly, we can attach logical and programming meaning to category theory constructions, which helps dramatically reduce the difficulty of category theory to that of elementary linear algebra.
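Here is a tiny sketch of the "proofs are programs" slogan, written in Python with type hints (the function name is my own illustration): a total function of type \((A \rightarrow B) \rightarrow (B \rightarrow C) \rightarrow (A \rightarrow C)\) is a constructive proof that implication is transitive.

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

# Under Curry-Howard, this type reads: "if A implies B and B implies C,
# then A implies C". Writing a total function of this type IS the proof.
def syllogism(ab: Callable[[A], B], bc: Callable[[B], C]) -> Callable[[A], C]:
    return lambda a: bc(ab(a))

# Usage: a concrete instance with A = int, B = str, C = int
length_of_str = syllogism(str, len)
assert length_of_str(12345) == 5
```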

There are two additional key points I want to make. First, category theory ignores the structure of the objects: they can be sets, topological spaces, posets, even physical systems. As such, uniqueness is relaxed in category theory, and things are unique up to isomorphism. Second, we strengthen uniqueness by seeking universal properties. This gives category theory its abstract flavour: the generalization of standard mathematical concepts in category theory involves diagrams which must commute. The typical definition is something like: "if there is an 'impostor' which claims to have the same properties as the concept being defined, then there exists a so-and-so isomorphism such that a certain diagram commutes, which guarantees that the impostor is nothing but a restatement of the same concept up to isomorphism". Next time I will talk about the first key definition we need from category theory, that of a product, and, by flipping the arrows, that of a coproduct.

Tuesday, February 28, 2017

Objects and arrows

With a one-day delay, let's continue the discussion about category theory. One way to look at category theory is as a generalization of the notion of equivalence: category theory = equivalence on steroids.

It is informative to look at the original motivation for category theory, and also at a problem from around 1900. Suppose you go back in time knowing no modern math except group theory, and you are aware of the Möbius strip and the Klein bottle. Your task is to figure out what else is possible; in other words, to classify all two-dimensional surfaces. Who can help you on this quest? Well, clothes are two-dimensional surfaces made by tailors. How do they make them? By two operations: cutting and stitching. Knowing group theory, you realize cutting and stitching are opposite operations, and that they respect the axioms of a group. This is actually where homology theory came from: associating groups with topological spaces in order to classify them. Now fast forward to the 1940s: several homology theories were known, and the problem was to explain why the groups involved in them are the same. How do we axiomatize homology theory, and how do we know if two homologies are equivalent? The answer lies in the concept of natural transformation, which requires the concept of functor, which in turn needs the idea of a category.

So what is a category? A category consists of objects and morphisms (arrows) such that the morphisms can be composed. Here are some examples:

-examples from math:
  • sets and functions
  • groups and group homomorphisms
  • Hilbert spaces and operators
  • partial order sets and monotone functions
  • manifolds and cobordisms
-examples from logic
  • propositions and proofs
-example from physics
  • physical systems and physical processes
-examples from computer science
  • data types and programs
Now a functor maps a category to another category by mapping objects to objects and arrows to arrows in a way that preserves structure. This is how for example in algebraic topology we associate groups to topological spaces. 
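A minimal sketch of functoriality, using the familiar list functor on the category of sets (the helper names `fmap` and `compose` are my own):

```python
# The list functor: on objects it sends a set X to lists over X;
# on arrows it sends f to "map f". This assignment preserves
# composition and identities -- the two functor laws.
def fmap(f):
    return lambda xs: [f(x) for x in xs]

f = lambda x: x + 1
g = lambda x: x * 2
compose = lambda p, q: (lambda x: p(q(x)))

xs = [1, 2, 3]
# Functor law 1: F(g o f) = F(g) o F(f)
assert fmap(compose(g, f))(xs) == compose(fmap(g), fmap(f))(xs)
# Functor law 2: F(id) = id
assert fmap(lambda x: x)(xs) == xs
```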

A natural transformation is an arrow (morphism) between functors subject to some (natural) conditions.

Apart from naturality, another key concept is universality, which means a unique (up to isomorphism) solution to a construction problem. We will encounter it when we express quantum mechanics in the category formalism.

Category theory reveals surprising relationships:
  • Cartesian products of sets are like greatest lower bounds of partial order sets
  • Proofs in logic are like programs in functional programming
Back to quantum mechanics: unitary evolution preserves information, and it should be no surprise that quantum information can be represented in a diagrammatic fashion. However, this is not the path I am going to take; I will make use of universality in deriving quantum mechanics from a simple principle, composition: a theory T describing two physical systems A and B must describe the composite system A+B as well. This is a very intuitive principle, but in the formalism of category theory it has extremely powerful mathematical consequences, spelling out the complete internal details of the theory T. Quantum mechanics comes out of this in its full detail.

Please stay tuned...

Sunday, February 19, 2017

Monoids: the root of it all

Let's start talking about category theory. We will start from set theory and, in the end, try to get away from it. The first thing we need to discuss is the magma. Basically, you have a binary operation on a set and that's all: \(M \times M\rightarrow M\). One problem with magmas is that there is no associativity. Now, not all mathematical operations lacking associativity are inherently primitive. Think of Lie algebras: the operation is not associative, but there you have something else: the Jacobi identity. A pure magma without any additional structure, however, is a rather inert object. The other problem with magmas is the lack of a unit. Add associativity and a unital element and category theory comes alive.

To link the discussion to physics, nature obeys the structure of a (commutative) monoid: two physical systems can be composed into a larger physical system:
- composition is the binary operation 
- associativity guarantees our ability to reason about physical systems regardless of how we split a physical system into subsystems: quantum mechanics is valid both for an electron and for an atom containing an electron
- the unital element is nothingness: composition with nothing leaves the original physical system intact.

In later posts I will show how quantum mechanics is a logical consequence of the commutative monoid above. In other words, quantum mechanics is inescapable and nature is quantum all the way.

Back to monoids, let's fall back on the usual example: composable functions, where the image of one function lies in the domain of the next. The link with programming is obvious: the output of one computation is plugged in as the input of another. As a side note, this is why functional programming is best explained in the language of category theory. When we talk about function composition we usually write \(f \circ g\), which means \(f(g(x))\). To jump in abstraction and eliminate considerations about the nature of the elements, there is an elementary trick to help navigate complex composition chains: read \(\circ\) as AFTER, like this: \(f~composed~with~g = f\circ g = f~ AFTER~ g\).
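Here is how the composition monoid looks in code, a small sketch with arbitrary example functions:

```python
# Function composition as a monoid, with the "f AFTER g" reading.
def after(f, g):
    return lambda x: f(g(x))   # f o g, i.e. f AFTER g

identity = lambda x: x          # the unital element

f = lambda x: x + 3
g = lambda x: x * 2
h = lambda x: x - 1

x = 10
# Associativity: (f o g) o h = f o (g o h)
assert after(after(f, g), h)(x) == after(f, after(g, h))(x)
# Unit laws: id o f = f o id = f
assert after(identity, f)(x) == f(x) == after(f, identity)(x)
```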

Now let's review the usual properties of injectivity and surjectivity:

Injectivity: a function is injective if, for any elements \(x, x^{'}\), \(f(x) = f(x^{'})\) implies \(x=x^{'}\)
Surjectivity: for any \(y\) in the codomain, there is an \(x\) in the domain such that \(f(x)=y\)

So how can we abstract this away and eliminate the talk about the elements? The corresponding category theory concepts are monic and epic:

Monic: a morphism is monic if for any \(g, h\) \(f\circ g = f\circ h ~implies~g=h\)
Epic: a morphism is epic if for any \(g, h\) \(g\circ f = h\circ f ~implies~g=h\)

Can you prove that if \(f:X\rightarrow Y\), then \(f\) is injective if and only if it is monic, and surjective if and only if it is epic? The proof can be found in many places, but it is instructive to try to prove it yourself without looking it up first, as this will help you better understand category theory.
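If you want to experiment before proving it, here is a brute-force sketch over small finite sets (the set sizes are arbitrary) checking that injective and monic coincide in the category of sets:

```python
from itertools import product

W = (0, 1)          # common domain of the test arrows g, h
X = (0, 1, 2)
Y = (0, 1, 2, 3)

def is_injective(f):
    # f : X -> Y is encoded as a tuple of values, f[x]
    return len({f[x] for x in X}) == len(X)

def is_monic(f):
    # f is monic iff f o g = f o h implies g = h, for all g, h : W -> X
    for g in product(X, repeat=len(W)):
        for h in product(X, repeat=len(W)):
            if all(f[g[w]] == f[h[w]] for w in W) and g != h:
                return False
    return True

# Exhaustively check every function f : X -> Y
for f in product(Y, repeat=len(X)):
    assert is_injective(f) == is_monic(f)
```

The dual check (surjective iff epic) works the same way with the test arrows composed on the other side.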

The last point I want to make today is that in category theory we move away from functions to abstract morphisms. The key point about morphisms is that they preserve mathematical structure. As such, they can be used to jump between categories of very different natures. This is how category theory becomes a unifying structure of mathematics, where the same patterns of reasoning can be replicated from logic to computer science, to algebraic topology, to quantum mechanics.

To be continued...

Sunday, February 12, 2017

A new way to look at mathematics

Today I want to start a series of posts about category theory. This is a vast area of mathematics which unifies logic, computer programming, combinatorics, cohomology, and quantum mechanics into a cohesive paradigm. It also settles the problem of the interpretation of quantum mechanics: by its very construction, category theory has no need for any realism baggage. All of mathematics can be expressed not in the language of sets (which are abstractions based on our classical intuition) but in the language of categories, free of any considerations about the nature of the elements. Regarding physics, the paradigm of category theory is best expressed by a famous Bohr quote:

"It is wrong to think that the task of physics is to find out how Nature is. Physics concerns what we say about Nature."

Let me start slow. The usual use of math, on the practical side, is to solve problems. How many times have we heard the lazy student's complaint: why should we learn this? Math is not about memorization, and math is very easy once we absorb its content. Learning math is a journey in mastering abstractions and general ways of reasoning. For example, when you learn about Lie groups you can extract a lot of key results by elementary methods, simply by studying matrices. However, you hit a wall with the octonions, because they are not associative and for this very reason do not admit a matrix representation. In turn, this precludes a proper understanding of the exceptional Lie groups.

Or consider a simpler example: topology. A lot of functional analysis can be done using the concepts of distance and metric spaces. For example, a subspace of \(R^n\) is compact iff it is closed and bounded. The metric spaces are then generalized by the concept of topological spaces, which are based on the ideas of neighborhoods, unions, and intersections. In this setting compactness is defined much more abstractly: a space is compact iff every open cover has a finite subcover.

A similar thing happens in category theory. Patterns of reasoning in various mathematical domains are abstracted away into a formalism which does not care about the nature of the elements. At first this is harder, and to help navigate it in the beginning you hold on to particular examples; the typical examples are functions. However, at some point you let go of the examples, just like in topology you let go of the notion of distance. At that point you learn to reason properly in category theory, and a lot can be achieved this way. Then we can make the journey backwards, from abstract to concrete. There is a big bonus in this: we have the flexibility to pick the concrete examples we want, and in our case we will pick quantum mechanics. Quantum mechanics is best and most naturally expressed in the language of category theory. Goodbye sets, goodbye classical realism, let the category journey begin. Please stay tuned.