
Type Checking and Cyclic Proof

Standard functional programming languages like Haskell and SML use polymorphic type systems. These type systems, however, are not sound: the type system itself cannot be used to prove properties of the software in the sense of "total correctness". To see that this is true, we can get a "proof" (read: program) of inhabitation of any arbitrary type A by simply writing a program like:

data Bot {- Look Ma! No constructors! -}
bot :: Bot
bot = bot

Clearly when we say that bot is an inhabitant of Bot, we don't mean that it actually produces a value of type Bot, since there aren't any: Bot has no constructors! We can easily use this kind of "proof" to prove something like A ∧ ¬A, which leads to a pretty degenerate logic. However, the type system is still *useful* in the sense that if the program ever *does* terminate, it is sure to do so with a value of the appropriate type. In exchange we get the full class of Turing-complete programs, a very useful benefit.
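To make the degeneracy concrete, here is a minimal sketch of my own (not from the original post) of inhabiting a contradictory conjunction through non-termination; the names Not and contradiction are illustrative assumptions, and Bot is the empty type declared above:

-- Negation encoded as "a implies Bot".
type Not a = a -> Bot

-- A "proof" of Bot ∧ ¬Bot. It type checks, but evaluation spins forever
-- and never produces a value.
contradiction :: (Bot, Not Bot)
contradiction = contradiction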

In Constructive Type Theory, we need to keep stronger guarantees. In languages such as Coq, we use syntactic restrictions to ensure that programs terminate (or coterminate). This, however, has the annoying feature that many programs which are obviously (co)terminating are rejected simply because of their syntax.
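A standard illustration of the problem (my example, not the post's): the Fibonacci stream below is clearly productive, yet its direct transcription into Coq's coinductive streams is rejected by the guardedness check, because the corecursive call is hidden under zipWith rather than sitting directly beneath a constructor.

-- Productive, but not syntactically guarded: the corecursive occurrence of
-- fibs appears under zipWith, so a purely syntactic check rejects it.
fibs :: [Integer]
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)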

Cyclic proofs give a method of describing inductive or coinductive proofs without requiring that we demonstrate up front that our term (co)terminates; we can defer that proof until later. The huge advantage of this is that we can use ideas from supercompilation as a form of proof normalisation, and then show that the syntactic termination criteria are satisfied for the resulting transformed proof rather than for the original.

As an example, take the following program (using the usual list and cons notation from Haskell):
codata CoNat = Z | S CoNat        -- conaturals: possibly infinite
data List = [] | (:) CoNat List   -- finite lists of conaturals

plus :: CoNat -> CoNat -> CoNat
plus Z y = y
plus (S x) y = S (plus x y)

sum :: List -> CoNat
sum [] = Z
sum (x:xs) = plus x (sum xs)
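Haskell itself has no codata keyword, but laziness lets ordinary data play both roles, so the example can be rendered as runnable Haskell. This is a sketch of mine, using the built-in list type and a primed name to avoid clashing with Prelude's sum:

data CoNat = Z | S CoNat          -- laziness admits infinite values of this type

plus :: CoNat -> CoNat -> CoNat
plus Z     y = y
plus (S x) y = S (plus x y)

sum' :: [CoNat] -> CoNat
sum' []     = Z
sum' (x:xs) = plus x (sum' xs)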

Now, we can associate with this the following (inferred) pre-proof type tree:

[Figure: the inferred pre-proof tree for sum]

Now, we can apply the usual supercompilation manipulations to this type tree, including beta-reduction and so on, to arrive at a new tree:

[Figure: the transformed proof tree]

This is actually a proof, rather than a pre-proof, as can be verified syntactically: it satisfies the guardedness condition for coinduction and the structural recursion condition for induction.

From the proof above, we can derive the following program, which is syntactically sound:

sum [] = Z
sum (Z:xs) = sum xs
sum (S x:xs) = S (f x xs)

f Z xs = sum xs
f (S x) xs = S (f x xs)
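For comparison, here is a hedged, runnable rendering of the transformed program, reusing CoNat and built-in lists from the sketch above and renaming sum to sumT; note that the recursive call f x xs is now directly guarded by the S constructor:

sumT :: [CoNat] -> CoNat
sumT []         = Z
sumT (Z:xs)     = sumT xs
sumT (S x : xs) = S (f x xs)

f :: CoNat -> [CoNat] -> CoNat
f Z     xs = sumT xs
f (S x) xs = S (f x xs)

-- For example, sumT [S Z, S (S Z)] evaluates to S (S (S Z)), i.e. 1 + 2 = 3.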

This process is basically an extended form of cut elimination. We can extend the applicability of cut elimination because we do not use induction rules directly, but instead use cycles in the proof; this lets us work with a larger class of normalisation-like transformations.

There are a lot of advantages to this approach. In the first program our function 'sum' did not meet the guardedness condition, which means it would not be admissible in Coq, despite being perfectly correct (it is in fact productive). Using pre-proofs we can defer the proof, which gives us better compositionality. We can even use higher-order functions which are not correct in general, applied to particular functions, to derive programs which are totally correct.

In addition, we can choose to show total correctness only for regions of a program rather than for the whole program: certain regions can be required to be totally correct, while total and partial correctness are freely mixed elsewhere.

There is still a ton of work to be done in this area. It would be nice to know which proof transformation rules, coupled with which algorithms, can solve various classes of problems. Komendantskaya has a very interesting class of productive functions which, I believe, could be found using a particular proof transformation algorithm. I'd like to have this algorithm and a proof that it works. In addition, I'd like more examples where this can be used to enhance compositionality (I'm thinking of filter functions in particular, where this might come in handy).

Sorry if this blog post is a bit of a whirlwind. I intend to lay out the entire theory in a slower and better-motivated way later.
