
The Logic of Space part III

Ok, so last time we left off with a very informal discussion about Venn diagrams and how they relate to Boolean Logic.

First let us do a little set theory and then we'll start drawing connections with the previous post to make it a bit more rigorous. We will stick with completely finite sets, so keep that in mind.

A set is basically a collection of distinguishable objects. Sets have no notion of the number of times an object is in them: they simply contain an object or they do not. A set (if it is finite) can be written in terms of its elements, for instance: S = {a, b, c} is the set S with elements a, b, and c.

A map can be thought of as arrows that leave from one set (the domain) and arrive at another (the codomain).

We will also introduce a special set 2^S, which is an exponential, or a "homset", called hom(S,2). S will be just a collection of elements as above, and 2 will be the collection of elements {0,1}. We can think of a homset as the collection of all maps from the elements of S into the elements of 2. In this particular case the maps have a special property because the codomain is 2: a map either sends an element to 0 or it sends it to 1. This means that we can fully describe a map from S to 2 by simply naming either the set of elements that go to 1 or the set that goes to 0. See the illustration below.
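
To make the naming trick concrete, here is a small Python sketch (the variable and helper names are my own, purely for illustration): a map from S to 2 is just a table of 0s and 1s, and it survives a round trip through the subset that it sends to 1.

    # A map f : S -> 2 written as a Python dict.
    S = {'a', 'b', 'c'}
    f = {'a': 1, 'b': 0, 'c': 1}

    # The naming trick: f is fully determined by the elements it sends to 1.
    ones = {x for x in S if f[x] == 1}        # {'a', 'c'}

    def map_from_subset(S, A):
        """Rebuild the map S -> {0,1} from the subset A that goes to 1."""
        return {x: 1 if x in A else 0 for x in S}

    assert map_from_subset(S, ones) == f      # nothing was lost in the naming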



Since S is finite, 2^S will also be finite. In fact, if we use the naming trick above, where we associate each map with the set of elements it sends to 1, the set of all maps can then be described as the set of all subsets of S. Try it out on paper with S = {a,b,c}: 2^S = {{}, {a}, {b}, {c}, {a,b}, {a,c}, {b,c}, {a,b,c}}.
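
If you would rather not do it on paper, here is a brute-force sketch (the helper name is mine) that enumerates 2^S by walking through every map from S to {0,1} and keeping the elements sent to 1:

    from itertools import product

    def powerset(S):
        """All subsets of S, one per map S -> {0,1}: each 0/1 tuple below
        is one such map, and we keep the elements it sends to 1."""
        elems = sorted(S)
        return [frozenset(x for x, b in zip(elems, bits) if b)
                for bits in product((0, 1), repeat=len(elems))]

    subsets = powerset({'a', 'b', 'c'})
    print(len(subsets))       # 2**3 == 8
    for A in subsets:
        print(set(A))         # prints all 8 subsets, {} through {'a','b','c'}

frozenset is used so that subsets can later sit inside other sets, which plain Python sets cannot do.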

Ok, so there are a couple of operations on sets that we will be interested in. One is ∩, which is pronounced "intersection", and the other is ∪, which is pronounced "union". The intersection of two sets is simply all elements that occur in both sets, so if we let A = {a,b,c,d} and B = {c,d,e,f} then A∩B = {c,d}. The union of two sets is every element that is in either of the sets, so A∪B = {a,b,c,d,e,f}.
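
Python's built-in sets happen to mirror these two operations directly, so the example above can be checked in a couple of lines:

    A = {'a', 'b', 'c', 'd'}
    B = {'c', 'd', 'e', 'f'}

    print(A & B)   # intersection: {'c', 'd'}
    print(A | B)   # union: {'a', 'b', 'c', 'd', 'e', 'f'}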

Ok, now as it turns out we want to talk about another structure called a topology. A topology is a set, in our case S, together with a set T of subsets of S; for T to really be a topology it must contain the empty set and S itself and be closed under unions and intersections (since we are sticking with finite sets, we needn't worry about the infinite cases). We will write it as the tuple <S, T>.

Now let us go back to Boolean Algebras momentarily. We had a Boolean Algebra B being a tuple <B, ∧, ∨, 0, 1>. Now let us identify B with 2^S, ∧ with ∩, ∨ with ∪, 0 with the empty set {}, and 1 with S itself. By doing this we have recovered a Boolean algebra simply by looking at the topology of S together with all of its subsets, using the intersection and union operations!
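
As a sanity check, here is a sketch that spot-checks a few Boolean algebra laws for <2^S, ∩, ∪, {}, S> on a small S (which laws to test is my own choice; it reuses the powerset helper from above):

    from itertools import product

    S = frozenset({'a', 'b', 'c'})

    def powerset(S):
        elems = sorted(S)
        return [frozenset(x for x, b in zip(elems, bits) if b)
                for bits in product((0, 1), repeat=len(elems))]

    P = powerset(S)

    # Distributivity, one of the laws that separates Boolean algebras
    # from mere lattices:
    for A, B, C in product(P, repeat=3):
        assert A & (B | C) == (A & B) | (A & C)

    # Complements behave: A ∨ ¬A = 1 and A ∧ ¬A = 0.
    for A in P:
        assert A | (S - A) == S
        assert A & (S - A) == frozenset()

    print("<2^S, ∩, ∪, {}, S> passes the spot checks")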

Ok, that is great, but it doesn't stop there. Topologies are not restricted to <S, 2^S>; we can look at other collections of subsets of S, and then it is interesting to ask whether we still get something like a Boolean algebra. It turns out, of course, that you don't get a Boolean algebra from just any set of subsets, but you do get some kind of algebra. The thing that you always get is called a lattice.
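
Here is a tiny concrete case (a toy example of my own): take S = {a, b} and T = {{}, {a}, S}. T is closed under intersection and union, so it forms a lattice, but the open set {a} has no complement inside T, so this lattice is not a Boolean algebra.

    from itertools import product

    S = frozenset({'a', 'b'})
    T = [frozenset(), frozenset({'a'}), S]   # a topology on S

    # T is a lattice: closed under pairwise intersection and union.
    for A, B in product(T, repeat=2):
        assert A & B in T and A | B in T

    # ...but no open U satisfies {a} ∪ U = S and {a} ∩ U = {},
    # so {a} has no complement within T.
    a = frozenset({'a'})
    print(any(a | U == S and a & U == frozenset() for U in T))   # False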

As it turns out, Quantum Mechanics also forms a kind of logic if we start with Hilbert space and use the idea of associating the subspaces with a logic. It takes more than just the union and intersection of sets, however: we also need the concept of an algebraic closure operator (an operator C with the properties X ⊆ C(X), C(C(X)) = C(X), and X ⊆ Y → C(X) ⊆ C(Y)), since the join of two Hilbert subspaces is in general a superset of their union. This is actually the source of the "entanglement" in quantum mechanics which leads to so much confusion. The algebraic closure in the case of Hilbert spaces is the span: the subspace spanned by the vectors of both sets. From this we can then look at the lattice structure and come to find that it isn't much like a Boolean Algebra at all, but is something called an "orthomodular lattice": a bounded lattice that is also an ortholattice (it has an orthocomplementation operation ′) and that satisfies the orthomodular law, a weakened form of modularity (if a ≤ b then b = a ∨ (b ∧ a′)). We can write the algebra as the tuple <S, ∧, ∨, ′, 0, 1>.
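
To see why the join outruns the union, here is a small numpy sketch (the rank-test helper is my own): the x-axis and y-axis are one-dimensional subspaces of the plane, and their join, the span of their union, contains the diagonal even though neither axis does.

    import numpy as np

    def in_span(vectors, v):
        """True iff v lies in the span of the given vectors (a rank test)."""
        A = np.array(vectors, dtype=float).T
        return (np.linalg.matrix_rank(np.column_stack([A, v]))
                == np.linalg.matrix_rank(A))

    x_axis = [(1.0, 0.0)]
    y_axis = [(0.0, 1.0)]
    diagonal = (1.0, 1.0)

    # The diagonal lies in neither one-dimensional subspace...
    print(in_span(x_axis, diagonal), in_span(y_axis, diagonal))  # False False

    # ...but it does lie in their join, the closure (span) of their union.
    print(in_span(x_axis + y_axis, diagonal))                    # True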

The orthomodular law is very interesting, but visually producing spaces that exhibit it is a bit tricky. I'll try to draw some nice pictures of spaces that display this behavior in an intuitive way in part IV.

P.S. I lied about getting to Heyting algebras, didn't I! :)

Ok, I'll say something about it. When we talk about our typical notions of space as we learned them in high school, we run into some differences with the Boolean algebraic description. The topology that we provided for our Boolean space 2^S has an unusual property: if we look at the dual of 2^S, in the sense that we exchange 0 for 1 in our naming of sets for hom(S,2), nothing changes. Suppose we have a complement operation in our topology, that is, an operator ¬ such that ¬A = S∖A, where ∖ produces the set of every element of S that is not also in A. Then the set of all complements of sets in 2^S is 2^S itself. In topology we say that the tuple <S, T> is the tuple of a set S and its open sets T; the complements of the open sets are the closed sets. In 2^S everything is both closed and open! In fact the silly mathematicians even say that such sets are "clopen".
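
A quick check of the clopen claim (a sketch, reusing the powerset helper): in 2^S the complement of every set is again in 2^S, whereas in the smaller toy topology from above, the complement of the open set {a} is not open.

    from itertools import product

    def powerset(S):
        elems = sorted(S)
        return [frozenset(x for x, b in zip(elems, bits) if b)
                for bits in product((0, 1), repeat=len(elems))]

    S = frozenset({'a', 'b', 'c'})
    P = powerset(S)
    print(all(S - A in P for A in P))    # True: everything in 2^S is clopen

    S2 = frozenset({'a', 'b'})
    T = [frozenset(), frozenset({'a'}), S2]
    print(all(S2 - A in T for A in T))   # False: {a} is open but not closed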

When we took algebra we learned that [0,1] is not the same as (0,1); that is, the set that includes its boundary is not the same as the one that does not. This leads us to a different topology, in which we have a clear distinction between open and closed sets. In the standard description of space from high school algebra (that is, the topology induced by the Euclidean distance measure) there are only two sets that have the property of being clopen (can you guess what they are?).

So in a Heyting algebra we don't even bother with the notion of complementation, since it leads to complications. Instead we introduce a symbol → into our algebra, giving us the algebra <S, ∧, ∨, →>. The symbol → is pronounced "implies".
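
Here is a sketch of how → behaves, using the standard definition of Heyting implication on the open sets of a topology (A → B is the largest open set U with U ∩ A ⊆ B); the helper name and the toy topology are mine. Negation can then be defined as ¬A := A → 0 (with 0 the empty set), and on our toy topology it fails to act like a Boolean complement:

    def implies(T, A, B):
        """Heyting implication on the open sets T: the union of every open
        U with U ∩ A ⊆ B, which (T being closed under unions) is the
        largest such open set."""
        result = frozenset()
        for U in T:
            if (U & A) <= B:
                result = result | U
        return result

    S = frozenset({'a', 'b'})
    T = [frozenset(), frozenset({'a'}), S]     # the toy topology from before

    a, bottom = frozenset({'a'}), frozenset()
    not_a = implies(T, a, bottom)   # Heyting negation: ¬{a}
    print(not_a)                    # frozenset() -- not {'b'}!
    print(a | not_a == S)           # False: the law of excluded middle fails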

I promise to tell you more later :)
