Have I uncovered an actual error in the SICP book? It says:
Exercise 3.27: Memoization (also called tabulation) is a technique that enables a procedure to record, in a local table, values that have previously been computed. This technique can make a vast difference in the performance of a program. A memoized procedure maintains a table in which values of previous calls are stored using as keys the arguments that produced the values. When the memoized procedure is asked to compute a value, it first checks the table to see if the value is already there and, if so, just returns that value. Otherwise, it computes the new value in the ordinary way and stores this in the table. As an example of memoization, recall from 1.2.2 the exponential process for computing Fibonacci numbers:
(define (fib n)
  (cond ((= n 0) 0)
        ((= n 1) 1)
        (else (+ (fib (- n 1))
                 (fib (- n 2))))))
The memoized version of the same procedure is
(define memo-fib
  (memoize
   (lambda (n)
     (cond ((= n 0) 0)
           ((= n 1) 1)
           (else
            (+ (memo-fib (- n 1))
               (memo-fib (- n 2))))))))
where the memoizer is defined as
(define (memoize f)
  (let ((table (make-table)))
    (lambda (x)
      (let ((previously-computed-result
             (lookup x table)))
        (or previously-computed-result
            (let ((result (f x)))
              (insert! x result table)
              result))))))
and then it says
Explain why memo-fib computes the nth Fibonacci number in a number of steps proportional to n.
The insert! and lookup procedures are defined in the book as follows:
(define (lookup key table)
  (let ((record (assoc key (cdr table))))
    (if record
        (cdr record)
        false)))

(define (assoc key records)
  (cond ((null? records) false)
        ((equal? key (caar records))
         (car records))
        (else (assoc key (cdr records)))))

(define (insert! key value table)
  (let ((record (assoc key (cdr table))))
    (if record
        (set-cdr! record value)
        (set-cdr! table
                  (cons (cons key value)
                        (cdr table)))))
  'ok)
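Not part of the book, but the same machinery can be sketched in Python for experimentation (a hypothetical translation; the table is a plain list of (key, value) pairs standing in for the headed association list, and the function names mirror the Scheme ones):

```python
def make_table():
    # the '*table*' head serves no purpose in Python; a bare list will do
    return []

def lookup(key, table):
    # like the book's lookup/assoc: a linear scan over the pairs
    for k, v in table:
        if k == key:
            return v
    return None

def insert(key, value, table):
    # like the book's insert!: scans for an existing record first,
    # and only prepends a new pair if none is found
    for i, (k, _) in enumerate(table):
        if k == key:
            table[i] = (key, value)
            return
    table.insert(0, (key, value))

def memoize(f):
    table = make_table()
    def memoized(x):
        previously_computed_result = lookup(x, table)
        if previously_computed_result is not None:
            return previously_computed_result
        result = f(x)
        insert(x, result, table)
        return result
    return memoized

def fib(n):
    # recursive calls go through memo_fib, exactly as in the book
    return n if n < 2 else memo_fib(n - 1) + memo_fib(n - 2)

memo_fib = memoize(fib)
print(memo_fib(100))  # 354224848179261915075
```

As in the Scheme version, the table is shared across all calls to memo_fib, so repeated top-level calls reuse earlier results.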
Now, assoc has a number of steps proportional to n. And since lookup and insert! use assoc, they both have a number of steps proportional to n.

I do not understand how memo-fib has a number of steps proportional to n. My observations are:
- In memo-fib (the lambda that has n as its formal parameter), the table would have mostly ordered keys, and the keys would be looked up in an ordered way. So it is safe to assume any call to lookup is close to a constant-time operation.
- insert!, on the other hand, is not aware that the keys are added in some order. If a value does not exist in the table, insert! will always scan the whole list, so it has a number of steps proportional to n every time.
- With n-1 elements in the table, computing (memo-fib n) takes a number of steps proportional to n, due to the assoc in insert!.
- (memo-fib n) would therefore have a number of steps proportional to n², because insert! is called on every recursive call to memo-fib.
- If lookup and insert! were constant-time, it would make sense for memo-fib to have a number of steps proportional to n. But the real number of steps looks like n * (n - k), where k is the number of keys already in the table.
Am I doing it wrong? What am I missing?
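The observation about insert! can be checked directly by counting key comparisons instead of timing anything. A minimal Python sketch (hypothetical; count_steps and its internals are not from the book) that models the table as a list of pairs:

```python
def count_steps(n):
    """Count every key comparison made while computing memo-fib(n)
    with a fresh table, using the book's insert! (which scans the
    table before inserting)."""
    table = []      # (key, value) pairs, newest first
    steps = 0

    def assoc(key):
        nonlocal steps
        for k, v in table:
            steps += 1
            if k == key:
                return v
        return None

    def memo_fib(m):
        cached = assoc(m)                  # the lookup
        if cached is not None:
            return cached
        result = m if m < 2 else memo_fib(m - 1) + memo_fib(m - 2)
        assoc(m)                           # the scan inside insert!
        table.insert(0, (m, result))
        return result

    memo_fib(n)
    return steps

# Lookups stay near-constant (the keys that are needed sit near the
# front of the table), but each insert scans the whole table, so the
# totals grow roughly quadratically:
print(count_steps(100))   # roughly 100**2 / 2
print(count_steps(200))   # roughly 4x the previous count
```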
Looks like you were right. It does run at about quadratic "complexity", empirically. The assoc in insert! is not needed at all; removing it doesn't change the return value, and only makes it run much, much faster.
To make the tests cleaner I've changed the memoization to not share the table between the calls.
#lang r5rs
(#%require srfi/19)

(define false #f)
(define true #t)

(define (memoize f)
  (let ((table (make-table)))
    (lambda (x)
      (let ((previously-computed-result
             (lookup x table)))
        (or previously-computed-result
            (let ((result (f x)))
              (insert! x result table)
              result))))))

(define (lookup key table)
  (let ((record (assoc key (cdr table))))
    (if record
        (cdr record)
        false)))

(define (assoc key records)
  (cond ((null? records) false)
        ((equal? key (caar records))
         (car records))
        (else (assoc key (cdr records)))))

(define (insert! key value table)
  (let ((record #f                       ; NB
          ; (assoc key (cdr table))      ; NB
          ))
    (if record
        (set-cdr! record value)
        (set-cdr! table
                  (cons (cons key value)
                        (cdr table)))))
  'ok)

(define (make-table)
  (list '*table*))

(define memo-fib
  (lambda (n)
    (letrec ((mf (memoize                ; NB
                  (lambda (n)
                    (cond ((= n 0) 0)
                          ((= n 1) 1)
                          (else
                           (+ (mf (- n 1))
                              (mf (- n 2)))))))))
      (mf n))))

(define (tt n)
  (let* ((t1 (current-time))
         (f (memo-fib n))
         (t2 (current-time))
         (td (time-difference t2 t1))
         (n (time-nanosecond td)))
    (/ (+ (* (time-second td) 1000000000)
          n)
       1000000.0)))    ; time in milliseconds

; > (memo-fib 100)
; 354224848179261915075

(define (tt2 n1 n2)
  (let* ((t1 (tt n1))
         (t2 (tt n2)))
    (values t1 t2
            (cond ((> t1 0)
                   (/ (log (/ t2 t1)) (log (/ n2 n1))))))))
The testing is done in a very rudimentary fashion. Times are in milliseconds.
; with the lookup in insert!:
; n1 n2 t1 t2 a in t ~ n^a, empirically
; > (tt2 2000 3000) ;=> 90.0 200.0 1.96936
; > (tt2 2000 3000) ;=> 100.0 220.0 1.94457
; > (tt2 2000 3000) ;=> 90.0 210.0 2.08969
; without the lookup: 80,000 takes under 1 second
; but run times are wildly erratic
So it indeed looks like an oversight by the authors: their use of the general insert! procedure, where in fact we know we only ever insert new entries into the table -- because we memoize the function in the first place!

So, insert! should be replaced by insert-new!:
(define (memoize f)
  (let ((table (make-table)))
    (lambda (x)
      (let ((previously-computed-result
             (lookup x table)))
        (or previously-computed-result
            (let ((result (f x)))
              (insert-new! x result table)
              result))))))

(define (insert-new! key value table)
  (set-cdr! table
            (cons (cons key value)
                  (cdr table)))
  'ok)
and then it should become linear.
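The linearity claim can be checked the same way the answer checks it empirically, but with step counts instead of wall-clock time (a Python sketch; fib_steps and exponent are hypothetical names, with exponent playing the role of tt2's log-ratio estimate):

```python
import math

def fib_steps(n, with_scan):
    """Count key comparisons for memo-fib(n) on a fresh table.
    with_scan=True models the book's insert!, False models insert-new!."""
    table, steps = [], 0

    def assoc(key):
        nonlocal steps
        for k, v in table:
            steps += 1
            if k == key:
                return v
        return None

    def mf(m):
        hit = assoc(m)
        if hit is not None:
            return hit
        r = m if m < 2 else mf(m - 1) + mf(m - 2)
        if with_scan:
            assoc(m)        # the redundant scan that insert-new! omits
        table.insert(0, (m, r))
        return r

    mf(n)
    return steps

def exponent(n1, n2, with_scan):
    # analogue of tt2: the a in steps ~ n^a, estimated from two sizes
    s1, s2 = fib_steps(n1, with_scan), fib_steps(n2, with_scan)
    return math.log(s2 / s1) / math.log(n2 / n1)

print(round(exponent(200, 400, True), 2))   # close to 2: quadratic
print(round(exponent(200, 400, False), 2))  # close to 1: linear
```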