### The Co-Natural Numbers in Haskell and Coq

I was playing around with the Peano numerals in Haskell (as I often do), because I remember briefly reading about a generalisation of exponentiation in a paper once when I didn't have time to read it. I decided I'd quickly look at it again now that I was in a mood to procrastinate.

```
> data Nat = Z | S Nat deriving Show
```
This data type is usually identified with the Peano numerals. In fact, as we will see, in Haskell it isn't quite the natural numbers. We take Z to mean 0, and S applied n times to Z, that is (S (S (... (S Z)))), to mean n. It's basically a unary number system where the length of the string gives us the number.

It is pretty straightforward to write the basic arithmetic operations on the Peano numerals.

```
> add x Z = x
> add x (S y) = S (add x y)
```
```
> mult x Z = Z
> mult x (S y) = add x (mult x y)
```
```
> expt x Z = (S Z)
> expt x (S y) = mult x (expt x y)
```
From these definitions we can see how to extend the series fairly easily. The only real question is what to do with the base case; it turns out that (S Z), i.e. 1, is a good choice.

```
> exptexpt x Z = (S Z)
> exptexpt x (S y) = expt x (exptexpt x y)
```
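The pattern running through add, mult, expt and exptexpt can be captured in one indexed function. Here is a minimal, self-contained sketch; `hyper` is a name I'm introducing (it isn't part of the development above), with `hyper 0 = add`, `hyper 1 = mult`, `hyper 2 = expt`, `hyper 3 = exptexpt`, and so on up the tower:

```haskell
data Nat = Z | S Nat deriving Show

from :: Nat -> Int
from Z     = 0
from (S x) = 1 + from x

to :: Int -> Nat
to 0 = Z
to n = S (to (n - 1))

-- hyper n is the (n+1)-th hyperoperation on Nat.
-- Base cases: add returns its first argument, mult returns Z,
-- and everything above mult returns (S Z), i.e. 1.
hyper :: Int -> Nat -> Nat -> Nat
hyper 0 x Z     = x
hyper 0 x (S y) = S (hyper 0 x y)
hyper n _ Z     = if n == 1 then Z else S Z
hyper n x (S y) = hyper (n - 1) x (hyper n x y)
```

Each level unfolds its successor case by calling the level below, exactly as expt unfolds into mult and mult into add.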
Playing with exptexpt got me thinking about the fact that I wasn't dealing with natural numbers, and I was curious what I could prove about them. To see that we don't actually have the natural numbers, we can define the following:

```
> inf :: Nat
> inf = S inf
```
This element is clearly not a natural number: as a recursive equation it is not valid, since there is no decreasing quantity. It is, however, a valid corecursive equation. It is productive in the sense that its recursive reference always occurs inside a constructor, so we will always be able to "peel" something away from it in order to reason about it. This will become obvious when we start looking at equational reasoning with infinity.
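To see productivity concretely, here is a small self-contained sketch. The function `minNat` is a name I'm introducing (it isn't defined above); it computes the minimum of two co-naturals by peeling one constructor at a time, so thanks to laziness it happily consumes inf as an argument, provided the other argument is finite:

```haskell
data Nat = Z | S Nat deriving Show

inf :: Nat
inf = S inf

-- Peel one S from each side per step; stop as soon as either side is Z.
-- Each step only inspects one constructor, so inf poses no problem.
minNat :: Nat -> Nat -> Nat
minNat Z _         = Z
minNat _ Z         = Z
minNat (S x) (S y) = S (minNat x y)

from :: Nat -> Int
from Z     = 0
from (S x) = 1 + from x

to :: Int -> Nat
to 0 = Z
to n = S (to (n - 1))
```

Evaluating `from (minNat inf (to 3))` terminates with 3, even though inf itself is infinite.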

So this leads us to the question: what does inf+x equal? I'll prove it in pseudo-Coq. Let us make the following conjecture:

∀ x, inf + x = inf

```
Theorem : ∀ x, inf + x = inf.
By coinduction:

case x = 0:
1a) inf + 0 = inf
2a) inf = inf
* true by the reflexive property of =
case x = S x':
1b) inf + S x' = inf
2b) S (inf + x') = inf
3b) S (inf + x') = S inf
4b) inf + x' = inf
* true as this is the coinductive hypothesis!
Qed.
```
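We can also check the theorem above to any finite depth in Haskell. Below is a minimal, self-contained sketch; `bisimTo` is my own name for a bounded approximation of bisimulation (true bisimilarity is agreement for every depth bound):

```haskell
data Nat = Z | S Nat

inf :: Nat
inf = S inf

add :: Nat -> Nat -> Nat
add x Z     = x
add x (S y) = S (add x y)

-- bisimTo n x y: do x and y agree for the first n observations?
bisimTo :: Int -> Nat -> Nat -> Bool
bisimTo 0 _ _         = True
bisimTo _ Z Z         = True
bisimTo n (S x) (S y) = bisimTo (n - 1) x y
bisimTo _ _ _         = False

to :: Int -> Nat
to 0 = Z
to n = S (to (n - 1))
```

For example, `bisimTo 1000 (add inf (to 5)) inf` evaluates to True: add pushes five constructors onto inf, and every layer we peel matches a layer of inf.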

Now the equality that we are using here is actually not term-based equality. It is a coinductive notion of equality known as bisimulation. It must also follow the rule that we never use recursion unless we are inside a constructor for the type our function produces. The recursion is the coinductive hypothesis, and the constructor applied prior to the use of the coinductive hypothesis comes from the definition of bisimulation. Namely:

∀ x y, x = y → S x = S y.

The full definition in Coq of the bisimulation which we denote above as "=" and below as "CoEq" is:
```
CoInductive CoEq : CoNat -> CoNat -> Prop :=
| coeq_base : CoEq Z Z
| coeq_next : forall x y, CoEq x y -> CoEq (S x) (S y).
```

The constructor coeq_next, which is used between steps 3b and 4b above, ensures that we satisfy our guardedness, or productivity, condition and can safely "call" the coinductive hypothesis. What do I mean by "call", and how is the "constructor" being used? We can see exactly, by printing out the term in Coq that proves inhabitation of the type of our proposition.

```
add_x_inf =
cofix add_x_inf (x : CoNat) : x + inf ~= inf :=
eq_ind_r (fun c : CoNat => x + c ~= c)
(eq_ind_r (fun c : CoNat => c ~= S inf)
(decomp_eql inf)
: forall x : CoNat, x + inf ~= inf
```

Here we see in gory detail a corecursive function that allows us to show this type is inhabited along with the corecursive call shown embedded in the constructor coeq_next. Sorry if it isn't very readable, it was constructed automagically from the proof script.

To see how this is all done in detail, check out the Coq proof script for the co-naturals. This might be interesting to those of you who want to see how to use coinductive reasoning on simple examples and how setoids work in Coq. You'll notice that the reasoning is slightly more cumbersome than what I've used here. I think this mostly comes down to the "cofix" tactic being unwieldy in Coq, and making automation a hassle. I'm going to think about ways to fix this since it really is unnecessarily inconvenient.

Here are some helper functions which convert from positive integers into our representation so you can play with the code more easily.

```
> to 0 = Z
> to n = S (to (n - 1))
```
```
> from Z = 0
> from (S x) = 1 + from x
```

### Decidable Equality in Agda

So I've been playing with typing various things in System-F which previously I had left with auxiliary well-formedness conditions. This includes substitutions and contexts, both of which are interesting to have well typed versions of. Since I've been learning Agda, it seemed sensible to carry out this work in that language, as there is nothing like a problem to help you learn a language.

In the course of proving properties, I ran into the age-old problem of showing that equivalence is decidable between two objects. In this particular case, I need to be able to show the decidability of equality over types in System F in order to have formation rules for variable contexts. We'd like a context Γ to have (x:A) only if (x:B) does not occur in Γ when (A ≠ B). For us to have statements about whether two types are equal or not, we're going to need to be able to decide if that's true using a terminating procedure.
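The Agda development isn't shown here, but the shape of the problem can be sketched in Haskell. The datatype `Ty` below is a toy System F type syntax of my own (de Bruijn indices for variables), and `decEq` is the Boolean shadow of Agda's proof-carrying Dec: a terminating structural comparison, without the evidence a dependently typed version would carry:

```haskell
-- A toy System F type syntax (hypothetical; names are mine, not from the post).
data Ty = TVar Int        -- type variable, as a de Bruijn index
        | Arrow Ty Ty     -- function type
        | Forall Ty       -- universal quantification, binding one variable
  deriving Show

-- Terminating decision procedure for syntactic equality of types.
decEq :: Ty -> Ty -> Bool
decEq (TVar m)    (TVar n)    = m == n
decEq (Arrow a b) (Arrow c d) = decEq a c && decEq b d
decEq (Forall a)  (Forall b)  = decEq a b
decEq _           _           = False
```

Termination is by structural recursion on the first argument, which is exactly the kind of argument Agda's termination checker accepts; the real payoff of Dec is that it returns a proof of A ≡ B or of its negation rather than a bare Bool.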

And so we arrive at our story. In Coq, equality is som…

### Formalisation of Tables in a Dependent Language

I've had an idea kicking about in my head for a while of making query plans explicit in SQL in such a way that one can be assured that the query plan corresponds to the SQL statement desired. The idea is something like a Curry-Howard in a relational setting. One could infer the plan from the SQL, the SQL from the plan, or do a sort of "type-checking" to make sure that the plan corresponds to the SQL.

The devil is always in the details however. When I started looking at the primitives that I would need, it turned out that the low level table joining operations are actually not that far from primitive SQL statements themselves. I decided to go ahead and formalise some of what would be necessary in Agda in order to get a better feel for the types of objects I would need and the laws which would be required to demonstrate that a plan corresponded with a statement.

Dependent types are very powerful and give you plenty of rope to hang yourself. It's always something of…

### Plotkin, the LGG and the MGU

Legend has it that a million years ago Plotkin was talking to his professor Popplestone, who said that unification (finding the most general unifier or the MGU) might have an interesting dual, and that Plotkin should find it. It turns out that the dual *is* interesting and it is known as the Least General Generalisation (LGG). Plotkin apparently described both the LGG for terms, and for clauses. I say apparently because I can't find his paper on-line.

The LGG for clauses is more complicated so we'll get back to it after we look at the LGG of terms. We can see how the MGU is related to the LGG by looking at a couple of examples and the above image. We use the Prolog convention that function symbols start with lower case and variables start with upper case. The image above is organised as a DAG (Directed Acyclic Graph). DAGs are a very important structure in mathematics; in particular, every lattice can be drawn as a DAG via its Hasse diagram.
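The LGG of terms can be sketched directly as anti-unification. The following is a minimal, self-contained Haskell sketch under my own term representation (not from Plotkin's paper): wherever the two terms disagree we introduce a variable, and the same pair of disagreeing subterms always gets the same variable, which is what makes the result *least* general:

```haskell
-- Hypothetical first-order term syntax: variables and function applications.
data Term = Var String | Fn String [Term]
  deriving (Eq, Show)

-- lgg s t: anti-unification. Threads a table of disagreement pairs and a
-- counter for fresh variable names (X0, X1, ...) through the traversal.
lgg :: Term -> Term -> Term
lgg s t = let (g, _, _) = go s t [] 0 in g
  where
    -- identical subterms generalise to themselves
    go a b seen n | a == b = (a, seen, n)
    -- same function symbol and arity: descend into the arguments
    go (Fn f as) (Fn h bs) seen n
      | f == h && length as == length bs =
          let step (acc, seen', n') (a, b) =
                let (g', seen'', n'') = go a b seen' n'
                in  (acc ++ [g'], seen'', n'')
              (gs, seenF, nF) = foldl step ([], seen, n) (zip as bs)
          in  (Fn f gs, seenF, nF)
    -- disagreement: reuse the variable for this pair, or mint a fresh one
    go a b seen n =
      case lookup (a, b) seen of
        Just v  -> (Var v, seen, n)
        Nothing -> let v = "X" ++ show n
                   in  (Var v, ((a, b), v) : seen, n + 1)
```

For example, the lgg of f(a, b) and f(a, c) is f(a, X0), while the lgg of f(b, b) and f(c, c) is f(X0, X0) rather than the strictly more general f(X0, X1), because the disagreement pair (b, c) is shared.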

Essentially what we have done is drawn an (incomplete) Hasse diagram f…