Standard functional programming languages like Haskell and SML use polymorphic type systems. These type systems, however, are not sound: the type systems themselves cannot be used to prove properties of the software in the sense of "total correctness". To see that this is true, note that we can get a "proof" (read: program) of inhabitation of any arbitrary type A by simply using the program:
data Bot {- Look Ma! No constructors! -}
bot :: Bot
bot = bot
Clearly, when we say that bot is an inhabitant of Bot, we don't mean that it actually produces a value of type Bot, since there aren't any: Bot has no constructors! We can easily use this kind of "proof" to prove something like A ∧ ¬A, which leads to a pretty degenerate logic. However, the type system is still *useful* in the sense that if the program ever *does* terminate, it is sure to do so with a value of the appropriate type. In exchange, the language remains Turing complete, a very useful benefit.
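To make the contradiction concrete, here is a small Haskell sketch (the name absurd is mine, not from any library): under the Curry-Howard reading, a pair type is a conjunction and a function into an empty type is a negation, so a single looping definition "proves" A ∧ ¬A.

```haskell
data Bot -- no constructors, as above

-- Read (a, a -> Bot) as "A and not A". This definition type-checks,
-- but evaluating it loops forever; it never yields an actual pair.
absurd :: (a, a -> Bot)
absurd = absurd
```

Thanks to laziness, absurd can even be passed around without being forced; only demanding its value diverges.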
In Constructive Type Theory, we need to keep stronger guarantees. In languages such as Coq, syntactic restrictions are used to ensure that programs terminate (or coterminate). This, however, has the annoying feature that many programs which are obviously (co)terminating are rejected simply because of their syntactic form.
Cyclic proofs give a method of describing inductive or coinductive proofs without requiring that we demonstrate up front that our term (co)terminates; we can defer that proof until later. The huge advantage of this is that we can use ideas from supercompilation as a form of proof normalisation, and then show that the syntactic termination criteria are satisfied for the resulting transformed proof, rather than for the original.
As an example, take the following program (with the usual list and cons notation from Haskell):
codata CoNat = Z | S CoNat

data List = [] | (:) CoNat List

plus :: CoNat -> CoNat -> CoNat
plus Z y = y
plus (S x) y = S (plus x y)

sum :: List -> CoNat
sum [] = Z
sum (x:xs) = plus x (sum xs)
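Haskell has no codata declarations, but its data is lazy, so we can sketch the same program with ordinary lists and run it; sumL, toInt (a fuel-bounded observer), and omega are names I've made up for illustration.

```haskell
-- Lazy data already admits infinite values, so CoNat below can hold
-- the infinite conatural 'omega'.
data CoNat = Z | S CoNat

plus :: CoNat -> CoNat -> CoNat
plus Z y     = y
plus (S x) y = S (plus x y)

sumL :: [CoNat] -> CoNat
sumL []     = Z
sumL (x:xs) = plus x (sumL xs)

-- Observe at most n constructors of a possibly infinite CoNat.
toInt :: Int -> CoNat -> Int
toInt 0 _     = 0
toInt _ Z     = 0
toInt n (S m) = 1 + toInt (n - 1) m

-- An infinite conatural: S (S (S ...)).
omega :: CoNat
omega = S omega
```

For example, toInt 10 (sumL [S Z, S (S Z)]) yields 3, and because plus is productive, toInt 5 (sumL [omega]) yields 5 even though omega is infinite.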
Now, we can associate with this the following (inferred) pre-proof type tree:
Now, we can use the usual supercompilation manipulations on this type tree, including beta-reduction, to arrive at this new tree:
This is actually a proof, rather than a pre-proof, as can be verified syntactically: it satisfies the guardedness condition for coinduction and the structural recursion condition for induction.
From the proof above, we can derive the following program, which is syntactically sound.
sum [] = Z
sum (Z:xs) = sum xs
sum (S x:xs) = S (f x xs)

f Z xs = sum xs
f (S x) xs = S (f x xs)
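Here is a runnable Haskell rendering of the transformed program, again over ordinary lists (sumT is my name for it, and the pattern S x : xs spells out the bound variable): every corecursive call through f now sits directly under the constructor S, and the remaining calls only descend structurally into the list.

```haskell
data CoNat = Z | S CoNat

sumT :: [CoNat] -> CoNat
sumT []         = Z
sumT (Z : xs)   = sumT xs        -- structural recursion on the list
sumT (S x : xs) = S (f x xs)     -- guarded: emits S before recursing

f :: CoNat -> [CoNat] -> CoNat
f Z xs     = sumT xs
f (S x) xs = S (f x xs)          -- guarded corecursive call

-- Fuel-bounded observer, as before.
toInt :: Int -> CoNat -> Int
toInt 0 _     = 0
toInt _ Z     = 0
toInt n (S m) = 1 + toInt (n - 1) m
```

For example, toInt 20 (sumT [S (S Z), S Z, Z]) gives 3, matching the original definition of sum.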
This process is basically an extended form of cut-elimination. We can extend the applicability of cut-elimination because we don't use induction rules directly, but instead use cycles in the proof; this lets us work with transformations, similar to normalisation, over a larger class of proofs.
There are a lot of advantages to this approach. In the first program, our function 'sum' did not meet the guardedness condition, which means it would not be admissible in Coq, despite being perfectly correct (it is in fact productive). Using pre-proofs, we can defer the proof, which gives us better compositionality. We can even apply higher-order functions which are not correct in general to particular functions, and derive programs which are totally correct.
In addition, we can decide to show total correctness only for certain regions of a program rather than for the entire program, freely mixing total correctness with partial correctness.
There is still a ton of work to be done in this area. It would be nice to know which proof transformation rules, coupled with which algorithms, can solve various classes of problems. Komendantskaya has a very interesting class of productive functions which, I believe, could be found using a particular proof transformation algorithm; I'd like to have this algorithm and a proof that it works. In addition, I'd like more examples where this approach enhances compositionality (I'm thinking of filter functions in particular, where it might come in handy).
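For a taste of why filter is the interesting case: filtering an infinite stream is productive only if matching elements keep appearing, which is exactly the kind of side condition a proof transformation would have to surface. A minimal sketch, where Stream, sfilter, shead, and nats are my own illustrative names:

```haskell
-- An infinite stream: no nil constructor, so every value is infinite.
data Stream a = a :> Stream a

sfilter :: (a -> Bool) -> Stream a -> Stream a
sfilter p (x :> xs)
  | p x       = x :> sfilter p xs  -- guarded: emits a constructor
  | otherwise = sfilter p xs       -- unguarded: may loop without output

shead :: Stream a -> a
shead (x :> _) = x

-- The stream 0 :> 1 :> 2 :> ...
nats :: Stream Int
nats = go 0 where go n = n :> go (n + 1)
```

Here shead (sfilter even nats) returns 0, but shead (sfilter (< 0) nats) would loop forever, since the otherwise branch never emits a constructor.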
Sorry if this blog post is a bit of a whirlwind. I intend to lay out the entire theory in a slower and better motivated way later.