 

How can fixed-point combinators make recursive constructs terminate?

Fixed-point combinators provide a way for anonymous functions to refer to themselves or to build mutually recursive structures. Although useful in the lambda calculus, they are essentially superfluous in modern programming languages, because most, if not all, support recursion, lambdas and closures.

Fixed-point combinators can also make recursive constructs such as left-recursive grammar parsers terminate. Consider Lickman 1995, who proves termination for his implementation but never actually explains how it works (the thesis is just a step-by-step derivation from lattice theory to the Haskell implementation) or why he needs a fixed-point combinator in a language that already supports recursion natively.

How does it work and why does he require a fixed-point combinator?

asked Nov 05 '14 by thwd

2 Answers

From a quick scan, at the end of section 5.3 Lickman writes: "As defined above, fixS is ensured to be sufficiently productive on all continuous inputs."

The point is to get the fixpoint operator to produce enough output so that parsing can continue. You can't do that for a general fix :: (a -> a) -> a but specializing a to Set a, or later Parser a, gives enough structure (namely that of a lattice) to work with.
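
To make the lattice point concrete, here is a minimal sketch of my own (not Lickman's fixS): for set-valued functions the bottom element is the empty set, and a least fixed point can be computed by applying the function repeatedly until the result stops changing, something a general fix :: (a -> a) -> a cannot give you.

import qualified Data.Set as Set

-- Least fixed point by Kleene iteration from the empty set. Only a sketch:
-- it assumes the function is monotone and that the chain of results
-- stabilises after finitely many steps.
fixSet :: Ord a => (Set.Set a -> Set.Set a) -> Set.Set a
fixSet h = go Set.empty
  where
    go s = let s' = h s
           in if s' == s then s else go s'

-- Example: the numbers reachable from 1 by doubling, capped at 100.
-- fixSet step == Set.fromList [1,2,4,8,16,32,64]
step :: Set.Set Int -> Set.Set Int
step s = Set.insert 1 (Set.filter (<= 100) (Set.map (* 2) s))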

Again, I've just looked through the thesis cursorily, but I think the proof (in section 5.5) of the statement "h :: Parser a -> Parser a maintains the property of productivity ==> fixP h is productive" is key.

answered Nov 11 '22 by yatima2975

Sure thing. Here is a simple right-recursive grammar in three rules:

S -> b T
T -> a S
T -> a

These three rules let us build a parser for recognizing these strings:

-- A parser returns whether it succeeded and the remaining input.
type Parser = String -> (Bool, String)

-- S -> b T
s :: Parser
s "" = (False, "")
s (c : cs) = if c == 'b' then t cs else (False, cs)

-- T -> a S | a
t :: Parser
t "" = (False, "")
t (c : cs)
  | c == 'a' && cs == "" = (True, "")   -- T -> a, end of input
  | c /= 'a'             = (False, cs)
  | otherwise            = s cs         -- T -> a S
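
A quick sanity check of these definitions (my own examples, not from the original answer):

s "ba"    -- (True,  "")
s "baba"  -- (True,  "")
s "bab"   -- (False, "")  the trailing 'b' is not followed by an 'a'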

If you want to do more general parsing, just generalize the Bool to some data structure, perhaps wrapped in a Maybe to indicate failure. Returning (False, ___) on a failed parse would help if S had some other rules too, such as S -> T T and T -> b b: when we get a 'b' followed by (False, ___), we rewind and try S -> T T instead. These sorts of grammars can be handled with a bit of elbow grease and recursion.
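
As a rough sketch of that generalization (the Parser' type and the orElse helper are my own names, not from the answer): the result slot carries a value, and a small alternation combinator does the rewinding.

-- A parse result is Just a value (plus remaining input) or Nothing on failure.
type Parser' a = String -> (Maybe a, String)

-- Hypothetical helper: try p, and if it fails, rewind to the original input
-- and try q instead. This is the "rewind and try S -> T T" move from above.
orElse :: Parser' a -> Parser' a -> Parser' a
orElse p q input = case p input of
  (Just x, rest) -> (Just x, rest)
  (Nothing, _)   -> q input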

The three rules above will successfully match strings like "ba", "baba", and so on. We could also describe the same strings with a left-recursive grammar:

S -> T a
T -> S b
T -> b

What happens if you try to write the same kind of parsers for these rules? An infinite loop, if you're looking at the front of the string. The problem is that the function for S will call the function for T first thing, then T will call S first thing, and they will mutually recurse ad infinitum. The computer is not smart enough to see that the postconditions ("followed by an a", "followed by a b") make that endless descent futile; it just descends into your functions and trusts that you know what you're doing.
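
To make the loop concrete, here is my own naive translation of the left-recursive rules, reusing the Parser type from above. sL immediately calls tL and tL immediately calls sL without consuming any input, so evaluation never terminates:

-- S -> T a
sL :: Parser
sL input = case tL input of
  (True, c : rest) | c == 'a' -> (True, rest)
  (_, rest)                   -> (False, rest)

-- T -> S b | b
tL :: Parser
tL input = case sL input of
  (True, c : rest) | c == 'b' -> (True, rest)
  _ -> case input of
         (c : rest) | c == 'b' -> (True, rest)
         _                     -> (False, input)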

How does a good fixed-point combinator help? Think of these rules as describing a tree: function evaluation traverses that tree depth-first, and in that direction this particular tree is infinite. A breadth-first traversal based on the same rules, on the other hand, can pick out the result that uses the fewest applications of these functions, and that result is the 'least fixed point' of a function built from this grammar. That is why the right fixed-point combinator (based either on the diag in the paper or on the lattice-theory combinator) can terminate on these rules while naive recursion will not.
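
Here is one way to make that concrete (my own illustration in the spirit of the lattice construction, not Lickman's actual fixP): for one fixed input, compute the set of prefix lengths each nonterminal can match as a least fixed point over finite sets. The sets can only grow and are bounded by the input length, so the iteration terminates even though the grammar is left-recursive.

import Data.List (nub, sort)

-- Prefix lengths that S can match under  S -> T a,  T -> S b | b.
matchesS :: String -> [Int]
matchesS input = fst (leastFix step ([], []))
  where
    n = length input
    at i c = i < n && input !! i == c
    norm = sort . nub
    -- One refinement step: rebuild (lengths for S, lengths for T) from the
    -- previous approximation.
    step (sLens, tLens) =
      ( norm  [ j + 1 | j <- tLens, at j 'a' ]                       -- S -> T a
      , norm ([ j + 1 | j <- sLens, at j 'b' ] ++ [ 1 | at 0 'b' ])  -- T -> S b | b
      )
    leastFix f x = let x' = f x in if x' == x then x else leastFix f x'

For example, matchesS "baba" evaluates to [2,4]; the whole string is accepted because its length, 4, appears in the list, whereas the naive sL above never returns on the same input.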

answered Nov 11 '22 by CR Drost