
Showing posts from November, 2006

Call by Name (CBN) is dual to Call By Value (CBV)

Probably one of the best papers I've read on the relationship between CBN, CBV and the Curry-Howard correspondence is Call-by-value is dual to call-by-name by Wadler. The calculus he develops to describe the relationship exhibits a schematic duality that is very visually appealing. After reading the paper I mentioned earlier on Socially Responsive, Environmentally Friendly Logic (which shall henceforth be called SREF Logic), it struck me that it would be interesting to see what a CPS (continuation-passing style) construction looks like in SREF logic, so I went back to the Wadler paper to see if I could figure out how to mimic the technique for multi-player logic. It looks like Wadler's formulation falls out directly from thinking about logic as a two-player game! I'm excited to see what happens with n-player logic. This has been a big diversion from what I'm actually supposed to be working on, but I didn't want to forget about it.
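To make the CPS side concrete, here is a minimal sketch of the two classic Plotkin-style CPS translations for the untyped lambda calculus; the only difference is how arguments are treated, which is the asymmetry Wadler's dual calculus makes precise. This is my own illustration (the Term representation and the reserved names "k", "f", "v" are arbitrary choices), not Wadler's calculus itself.

```haskell
-- Untyped lambda terms.
data Term = Var String | Lam String Term | App Term Term
  deriving Show

-- Call-by-value CPS translation (Plotkin): evaluate the function,
-- then the argument, then pass the value and the continuation along.
-- Assumes source variables are distinct from "k", "f", "v"
-- (a real translation would generate fresh names).
cbv :: Term -> Term
cbv (Var x)   = Lam "k" (App (Var "k") (Var x))
cbv (Lam x m) = Lam "k" (App (Var "k") (Lam x (cbv m)))
cbv (App m n) =
  Lam "k" (App (cbv m) (Lam "f"
            (App (cbv n) (Lam "v"
              (App (App (Var "f") (Var "v")) (Var "k"))))))

-- Call-by-name CPS translation: a variable already denotes a
-- suspended computation, and arguments are passed unevaluated.
cbn :: Term -> Term
cbn (Var x)   = Var x
cbn (Lam x m) = Lam "k" (App (Var "k") (Lam x (cbn m)))
cbn (App m n) =
  Lam "k" (App (cbn m) (Lam "f"
            (App (App (Var "f") (cbn n)) (Var "k"))))
```

Note how the App case is the one place the two translations diverge: CBV forces the argument before the call, CBN hands it over as a thunk.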

Internet to the rescue

Once again the internet comes to the rescue, this time with a Systematic Search for Lambda Expressions. This answers yesterday's question of whether we can iterate over isomorphic proofs exhaustively in order to extract all programs of a specification, where the specification is realised as a "type". Hooray for computer science!
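A minimal sketch of the idea, assuming we restrict to the implicational fragment of simple types: enumerate the beta-normal, eta-long terms of a goal type by bounded proof search. This is my own toy version, not the procedure from the paper.

```haskell
-- Types in the implicational fragment: type variables and arrows.
data Ty = TV String | Ty :-> Ty deriving (Eq, Show)
infixr 5 :->

-- Lambda terms.
data Tm = Var String | Lam String Tm | App Tm Tm deriving Show

-- Split a type into its argument types and final result.
unarrow :: Ty -> ([Ty], Ty)
unarrow (a :-> b) = let (as, r) = unarrow b in (a : as, r)
unarrow t         = ([], t)

-- Enumerate all inhabitants of a goal type, in beta-normal eta-long
-- form, up to a depth bound: arrow goals always take a lambda step,
-- atomic goals are proved by fully applying some hypothesis.
inhabit :: Int -> [(String, Ty)] -> Ty -> [Tm]
inhabit 0 _ _ = []
inhabit d ctx (a :-> b) =
  [ Lam x body
  | let x = "x" ++ show (length ctx)
  , body <- inhabit (d - 1) ((x, a) : ctx) b ]
inhabit d ctx goal =
  [ foldl App (Var x) args
  | (x, t) <- ctx
  , let (as, r) = unarrow t
  , r == goal
  , args <- mapM (inhabit (d - 1) ctx) as ]

-- e.g. inhabit 4 [] ((TV "a" :-> TV "b") :-> TV "a" :-> TV "b")
--   ==> [Lam "x0" (Lam "x1" (App (Var "x0") (Var "x1")))]
```

Restricting the search to normal forms is what keeps the enumeration exhaustive rather than merely terminating: every inhabitant has a beta-normal eta-long representative.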

Brain Dump

I have a bunch of ideas that I don't have time to follow up on, but I'd like to retain them in some fashion, so here goes. 1. A long time ago I had a question about the shortest assembly program (parameterised by the instruction set, of course!) that could encode the generators of the quaternions under multiplication. It is a very easy thing to program, but as I was doing it, I found myself highly doubtful that I was choosing the "right" encoding. In fact I almost certainly wasn't. Would there not be some very tight encoding in terms of "normal" assembly language operators? This has been a persistent question in my mind, one that comes up frequently in different and often more general forms. If you want to make a fast compiler, how do you know what instructions to output? If you have a high-level specification, how can you avoid the ridiculously high overhead usually associated with extremely high-level languages? Today I found this fabulous paper…
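For reference, the specification behind that first question is just the multiplication of the quaternion group Q8. Here is a minimal sketch using one arbitrary (sign, basis unit) representation; the representation choice is exactly the kind of arbitrariness the question is about.

```haskell
-- The quaternion group Q8, represented as (sign, basis unit).
data Unit = One | I | J | K deriving (Eq, Show)

type Quat = (Int, Unit)  -- sign is +1 or -1

-- Multiplication of basis units, returning a sign and a unit:
-- i*j = k, j*k = i, k*i = j, and the squares of i, j, k are -1.
unitMul :: Unit -> Unit -> (Int, Unit)
unitMul One u   = ( 1, u)
unitMul u   One = ( 1, u)
unitMul I I = (-1, One)
unitMul J J = (-1, One)
unitMul K K = (-1, One)
unitMul I J = ( 1, K)
unitMul J I = (-1, K)
unitMul J K = ( 1, I)
unitMul K J = (-1, I)
unitMul K I = ( 1, J)
unitMul I K = (-1, J)

-- Full group multiplication: signs multiply through.
qmul :: Quat -> Quat -> Quat
qmul (s1, u1) (s2, u2) =
  let (s3, u) = unitMul u1 u2 in (s1 * s2 * s3, u)
```

Any assembly encoding has to realise this same table; the open question is which machine-level representation of the eight elements makes that realisation shortest.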

On partial evaluation (and other meta-compilers)

So after a bit of research, it turns out that the big reason we don't have a meta-compiler compiler-generator project is that sophisticated partial evaluators are pretty much never self-applicable. As far as I can tell from the literature, the Futamura projections in the context of logic programming have only ever been applied in practice with "off-line" partial evaluators. Off-line partial evaluators tend to be much less sophisticated, since they do most of their reasoning "off-line", that is, without attempting to unfold. Apparently part of the reason for this is the use of extra-logical features in the sophisticated partial evaluators, needed to make them fast enough to be reasonably applied. It is hard to write performant Prolog without using cut. Once cut is used, however, all bets are off, since it is nearly impossible to reason about effectively. I started writing my meta-compiler without using cut, restricting myself instead to soft-cut, because…
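As a reminder of what self-application buys you, here is a minimal, hedged sketch of the three Futamura projections in Haskell, where "mix" is the most naive specialiser imaginable: partial application. A real partial evaluator transforms program text and does the unfolding discussed above; this sketch only shows how the types of the projections line up, with a toy expression interpreter invented for illustration.

```haskell
-- Object language: arithmetic over a single input variable.
data Expr = Lit Int | Input | Add Expr Expr | Mul Expr Expr

-- An interpreter takes (source program, input datum) to an output.
interp :: (Expr, Int) -> Int
interp (Lit n,   _) = n
interp (Input,   x) = x
interp (Add a b, x) = interp (a, x) + interp (b, x)
interp (Mul a b, x) = interp (a, x) * interp (b, x)

-- The "partial evaluator": specialise a two-input program to its
-- first (static) input. Here that is just partial application.
mix :: ((s, d) -> o) -> s -> (d -> o)
mix p s = \d -> p (s, d)

-- First projection: specialising the interpreter to a source
-- program yields a "compiled" program.
target :: Int -> Int
target = mix interp (Add Input (Lit 1))

-- Second projection: self-application. Specialising mix to the
-- interpreter yields a compiler.
compiler :: Expr -> (Int -> Int)
compiler = mix (uncurry mix) interp

-- Third projection: specialising mix to mix yields a compiler
-- generator.
cogen :: ((s, d) -> o) -> (s -> d -> o)
cogen = mix (uncurry mix) (uncurry mix)

main :: IO ()
main = print (target 41, compiler (Mul Input (Lit 2)) 21)  -- (42, 42)
```

The second and third projections are precisely the self-applications that fail in practice once the specialiser relies on extra-logical features like cut.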

Plotkin, the LGG and the MGU

Legend has it that a million years ago Plotkin was talking to his professor Popplestone, who said that unification (finding the most general unifier, or MGU) might have an interesting dual, and that Plotkin should find it. It turns out that the dual *is* interesting, and it is known as the Least General Generalisation (LGG). Plotkin apparently described the LGG both for terms and for clauses. I say apparently because I can't find his paper on-line. The LGG for clauses is more complicated, so we'll get back to it after we look at the LGG of terms. We can see how the MGU is related to the LGG by looking at a couple of examples and the above image. We use the Prolog convention that function symbols start with lower case and variables start with upper case. The image above is organised as a DAG (Directed Acyclic Graph). DAGs are an important structure in mathematics: every DAG induces a partial order, and here the terms, ordered by generality, form a lattice in which unification gives the meet and generalisation the join. Essentially what we have done is drawn an (incomplete) Hasse diagram…
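Here's a minimal sketch of the LGG on terms (Plotkin-style anti-unification) in Haskell. The Term representation and variable-naming scheme are my own choices; the key invariant is that the same pair of disagreeing subterms always maps to the same fresh variable, which is what makes the result *least* general.

```haskell
import Control.Monad (zipWithM)
import Control.Monad.State
import qualified Data.Map as M

-- First-order terms, Prolog-style: variables and function symbols.
data Term = Var String | Fn String [Term] deriving (Eq, Ord, Show)

-- State: disagreement pairs seen so far, plus a fresh-name counter.
type AU = State (M.Map (Term, Term) Term, Int)

-- lgg s t: keep matching function symbols, and replace each distinct
-- pair of disagreeing subterms with one consistent fresh variable.
lgg :: Term -> Term -> Term
lgg s t = evalState (go s t) (M.empty, 0)
  where
    go :: Term -> Term -> AU Term
    go (Fn f ss) (Fn g ts)
      | f == g && length ss == length ts = Fn f <$> zipWithM go ss ts
    go a b = do
      (seen, n) <- get
      case M.lookup (a, b) seen of
        Just v  -> return v                      -- reuse the variable
        Nothing -> do
          let v = Var ("Z" ++ show n)            -- mint a fresh one
          put (M.insert (a, b) v seen, n + 1)
          return v

-- e.g. lgg f(a, g(a)) f(b, g(b)) ==> f(Z0, g(Z0)): the disagreement
-- (a, b) occurs twice but is generalised by the *same* variable.
```

Reusing variables across repeated disagreements is the whole trick: generalising the example above to f(Z0, g(Z1)) would also be a common generalisation, just not the least one.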

More Questions than Answers

Code transformation, or meta-compilation as it is sometimes called (the general notion covering techniques including Partial Evaluation, Supercompilation, Deforestation, and my advisor's Distillation), is a powerful technique in computer programming. Its benefits (and drawbacks) are almost certainly not sufficiently studied. I was just conversing with my room-mate Tom about meta-compilation, and I made the supposition that meta-compilers are somewhat like the technology of the lathe. There are a huge number of technologies that require a lathe in order to be produced efficiently. A lathe can be viewed as a major nexus in the dependency graph of machining technology. A lathe is almost a completely fixed precondition for the mill. The mill is the crux of modern machining: it allows you to construct almost any currently available machined part. Without the mill we really wouldn't have the industrial age at all. Do such things exist in computer programming? Metacompiler…