
Why is the call-by-value evaluation strategy not Turing complete?

I'm reading an article about different evaluation strategies (I linked the Wikipedia article, but the one I'm actually reading is not in English). It says that, unlike the call-by-name and call-by-need strategies, the call-by-value strategy is not Turing complete.

Can anybody explain why that is? If possible, please add an example.

Roman asked May 31 '10

1 Answer

I dispute the claim in the article you are reading. (I'm not getting paid for this, so I'm going to provide a suggestive argument, not a proof.)

It is well known that, at least under normal-order reduction (aka call by name), the pure lambda calculus is Turing-complete. If we look at John Reynolds's seminal paper Definitional Interpreters for Higher-Order Programming Languages, we see that Reynolds discusses at length the difference between call by name and call by value. A critical part of the argument is that, to make the distinction precise, we can transform a program into continuation-passing style. The CPS transform is different for call by name and call by value, but the resulting transformed terms can be evaluated in either style (and give the same results either way).

So here comes the argument: write a lambda-calculus program that simulates a Turing machine, then CPS transform it using the CBN transform, and you can evaluate the resulting code using a CBV reduction strategy. Bang! Turing-complete.
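To make that concrete, here is a small sketch in Python (chosen only because it is a strict, call-by-value language; none of this code comes from Reynolds's paper). It applies Plotkin's call-by-name CPS transform by hand to the term (λx. c) Ω, which terminates under call by name but diverges under call by value, and then runs the transformed term under Python's call-by-value semantics:

    # Omega = (λx. x x)(λx. x x) diverges under call by value, so in direct
    # style, (lambda x: "done")((lambda x: x(x))(lambda x: x(x))) never
    # returns in Python.  Under call by name it returns "done".
    #
    # Plotkin's call-by-name CPS transform (variables stand for CPS'd
    # computations):
    #   [[x]]    = x
    #   [[λx.M]] = λk. k (λx. [[M]])
    #   [[M N]]  = λk. [[M]] (λm. m [[N]] k)

    # [[λx. x x]]  =  λk. k (λx. λk'. x (λm. m x k'))
    w = lambda k: k(lambda x: lambda k2: x(lambda m: m(x)(k2)))

    # [[Ω]] = [[(λx. x x)(λx. x x)]]
    omega = lambda k: w(lambda m: m(w)(k))

    # [[λx. "done"]]   (a constant c is transformed as λk. k c)
    const = lambda k: k(lambda x: lambda k2: k2("done"))

    # [[(λx. "done") Ω]]
    prog = lambda k: const(lambda m: m(omega)(k))

    print(prog(lambda v: v))   # prints 'done': the call-by-name answer,
                               # computed by a call-by-value evaluator

The transformed Ω is just a lambda value that never gets applied, so the strict evaluator never falls into the loop; the call-by-name behaviour has been compiled into the term itself.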

In practice, I bet you can write a CBV program that emulates a Turing machine directly; it's probably enough to pick a fixed-point combinator that works under call by value, such as Z or Θ_v. (The more famous Y combinator, like the plain Θ, diverges under a call-by-value reduction strategy and works only under call by name, i.e., normal-order reduction.)
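For instance (a sketch of mine, not something from the sources above): in Python, which is call by value, the eta-expanded combinator usually called Z gives you general recursion, whereas the plain Y combinator just loops:

    # Y = λf. (λx. f (x x)) (λx. f (x x)) diverges under call by value,
    # because the argument (x x) is evaluated eagerly.  Eta-expanding the
    # self-application hides it behind a lambda, giving the call-by-value
    # fixed-point combinator Z:
    Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

    # General recursion with no built-in recursion: factorial.
    fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
    print(fact(5))   # 120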

Disclaimer: I haven't studied lambda calculus in ages, and I'm sure there are several details wrong in the argument above. But I'm confident in the substance. It wouldn't be the first time I spotted something blatantly wrong in online resources about programming-language theory.

Norman Ramsey answered Nov 03 '22