Thanks to my brother I got a really cool book on proof theory called "Basic Proof Theory". It has a bunch of nice features, including a from-the-ground-up presentation of proof theory that should be relatively accessible to anyone with a background in mathematics. It demonstrates some of the connections provided by the Curry-Howard correspondence (which is my favourite part of the book). It also describes second order logic, which is great because I've had very little formal exposure to it. Second order logic is really beautiful since you can define all the connectives in terms of ∀, ∀2 and →. If you pun ∀ and ∀2 you have a really compact notation.
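For example, the usual second order encodings (as I understand them) recover the other connectives like this, writing ∀ for both quantifiers and X for a fresh propositional variable:
⊥ ≡ ∀X. X
A ∧ B ≡ ∀X. (A → (B → X)) → X
A ∨ B ≡ ∀X. (A → X) → ((B → X) → X)
¬A ≡ A → ⊥
∃x. A ≡ ∀X. (∀x. (A → X)) → X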
The book also forced me to learn some things I hadn't wrapped my head around. One of those was Gentzen-style sequent calculus. This turns out to be pretty easy when you have a good book describing it. I've even written a little sequent solver (in Lisp) since I found the proofs so much fun. The first-order intuitionistic sequent solver is really not terribly difficult to write. Basically I treat the proofs as goal-directed, starting with a sequent of the form:
⇒ F
And try to arrive at leaves of the tree that all have the form:
A ⇒ A
I have already proven 'F ⇒ F' for compound formulas F from 'A ⇒ A', so I didn't figure it was necessary to make the solver do it. The solver currently only works with propositional formulas (it solves a type theory where types are not parametric), but I'm interested in limited extensions, though I haven't thought much about them. I imagine I'd quickly get something undecidable if I'm not careful.
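To give the flavour, here is a rough sketch of the kind of goal-directed search I mean, cut down to the → fragment. It isn't my actual solver: the formula representation ((-> A B) for A→B) is just what I picked for the sketch, and it uses the contraction-free left rules in the style of Dyckhoff's LJT/G4ip so that the backward search terminates, rather than the plain Gentzen rules.

;;; Sketch only: formulas are s-expressions, atoms are symbols,
;;; implications are lists of the form (-> A B).

(defun imp-p (f) (and (consp f) (eq (first f) '->)))

(defun prove (gamma goal)
  "Search bottom-up for a proof of the sequent GAMMA => GOAL."
  (if (imp-p goal)
      ;; R->: to prove A -> B, move A to the left and prove B
      (prove (cons (second goal) gamma) (third goal))
      (prove-atomic gamma goal)))

(defun prove-atomic (gamma p)
  "The goal P is atomic: close the branch with an axiom A => A,
or break down an implication on the left (contraction-free rules)."
  (or (and (member p gamma :test #'equal) t)
      (some (lambda (f)
              (when (imp-p f)
                (let ((rest (remove f gamma :test #'equal :count 1))
                      (a (second f))
                      (b (third f)))
                  (if (imp-p a)
                      ;; (C -> D) -> B on the left:
                      ;; prove REST, D -> B => C -> D  and  REST, B => P
                      (and (prove (cons (list '-> (third a) b) rest) a)
                           (prove (cons b rest) p))
                      ;; atomic antecedent: usable only when A is already on the left
                      (and (member a gamma :test #'equal)
                           (prove (cons b rest) p))))))
            gamma)))

;; (prove '() '(-> a (-> (-> a b) b)))  ; => T   (modus ponens as a formula)
;; (prove '() '(-> (-> (-> p q) p) p))  ; => NIL (Peirce's law, not intuitionistic)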
Anyhow, working with the sequent calculus got me thinking about →. In the book they present the rule R→ like this:
Γ, A ⇒ Δ, B
─────────────
Γ ⇒ A→B, Δ
This is a bit weird, since there is nothing that goes the other direction; i.e. in none of minimal, intuitionistic or classical logic do you find a rule in which you introduce a connective on the left from formulas on the right. I started looking around for something that does this and I ran into Basic Logic. I haven't read the paper yet, so I can't really comment on it. I'll let you know after I'm done with it.
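For reference, the matching left rule (in its usual classical two-premise form, if I remember it right) is:

Γ ⇒ Δ, A        Γ, B ⇒ Δ
──────────────────────────
Γ, A→B ⇒ Δ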