Name that technique (it may be called 'piggybacking')

What is the name of the following method/technique (I'll try to describe it as best I can; background on "memoization" is probably needed to understand why this technique can be very useful):

You start some potentially lengthy asynchronous computation and realize that an identical computation has already been started but is not done yet, so you "piggyback" on the first computation. Then, when the first computation ends, it issues not one but two callbacks.

The goal is to not needlessly start a second computation because you know that there's already an identical computation running.
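
To make that concrete, here is a minimal sketch in Python of what I mean (all names here, such as Piggybacker, request and compute, are made up for illustration; a real implementation would differ):

import threading

class Piggybacker:
    def __init__(self, compute):
        self._compute = compute      # the potentially lengthy function
        self._pending = {}           # key -> callbacks waiting on that key
        self._lock = threading.Lock()

    def request(self, key, callback):
        with self._lock:
            if key in self._pending:
                # An identical computation is already in flight:
                # piggyback this callback on it instead of starting new work.
                self._pending[key].append(callback)
                return
            self._pending[key] = [callback]
        threading.Thread(target=self._run, args=(key,)).start()

    def _run(self, key):
        result = self._compute(key)
        with self._lock:
            callbacks = self._pending.pop(key)
        for cb in callbacks:         # one computation, one callback per caller
            cb(result)

If a second request(key, cb2) arrives while the first request(key, cb1) is still running, only one computation runs, and both cb1 and cb2 fire when it finishes.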

Note that although not entirely dissimilar, I'm not looking for the particular case of caching that "memoization" is: memoization is when you start a computation and find a cached (memoized) result of that same computation, already completed, that you can reuse.

Here I'm looking for the name of the technique that is in a way a bit similar to memoization (in that it can be useful for some of the same reasons that memoization is a useful technique), except that it reuses the result of the first computation even when that first computation is not yet done at the time you issue the second one.

I've always called that technique "piggybacking" but I don't know if this is correct.

I've actually used this more than once as some kind of "memoization on steroids" and it came in very handy.

I just don't know what the name of this (advanced?) technique is.

EDIT

Damn, I wanted to comment on epatel's answer but it disappeared. epatel's answer gave me an idea: this technique could be called "lazy memoization" :)

asked Mar 04 '10 by cocotwo

4 Answers

This is just memoization of futures.

Normal "eager" memoization works like this:

import threading

answers = {}               # memo table: x -> the stored answer for f(x)
lock = threading.Lock()    # guards the critical section

def f_memo(x):
    # f is the underlying (possibly slow) function, defined elsewhere
    with lock:                    # critical_section
        if x in answers:
            return answers[x]
        a = f(x)
        answers[x] = a
        return a

Now if f(x) returns futures instead of actual results, the above code works as-is and you get the piggyback effect, like this (a concrete sketch follows the list):

  1. The first thread calls f_memo(3).
  2. There is no stored answer for f(3), so inside the critical section f(3) is called. Since f(3) is implemented as returning a future, the "answer" is ready immediately: 'a' in the code above is set to the future F, and F is stored in the answers table.
  3. The future F is returned as the "result" of the call, even though the underlying computation is potentially still ongoing.
  4. Another thread calls f_memo(3).
  5. The future F is found in the table and returned immediately.
  6. Now both threads hold a handle to the result of the same computation; when they try to read it, they block until the computation is ready. In the post this communication mechanism was described as a callback, presumably in a context where futures are less common.
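
As a rough Python sketch of that walkthrough (slow_square, pool and the 1-second delay are invented for illustration), f can be wrapped so that it returns a concurrent.futures.Future and then handed to the f_memo above:

import time
from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor(max_workers=2)

def slow_square(x):
    time.sleep(1)                   # stands in for a lengthy computation
    return x * x

def f(x):
    # Returns immediately with a future; the real work runs in the pool.
    return pool.submit(slow_square, x)

a = f_memo(3)      # steps 1-3: starts the work, caches and returns future F
b = f_memo(3)      # steps 4-5: future F is found in the table and returned
assert a is b      # both callers hold the same handle
print(a.result())  # step 6: blocks until the computation is done, prints 9
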
answered Oct 08 '22 by Antti Huima


Sounds like a future: http://en.wikipedia.org/wiki/Future_%28programming%29
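
As a rough illustration (slow_double and the timing are invented), a future is a handle to a result that may not exist yet, and several consumers can attach callbacks to the same pending future:

import time
from concurrent.futures import ThreadPoolExecutor

def slow_double(x):
    time.sleep(1)              # stands in for the lengthy computation
    return 2 * x

with ThreadPoolExecutor() as pool:
    fut = pool.submit(slow_double, 21)
    # Two independent consumers piggyback on the same pending result.
    fut.add_done_callback(lambda f: print("first consumer:", f.result()))
    fut.add_done_callback(lambda f: print("second consumer:", f.result()))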

answered Oct 08 '22 by ergosys


In some contexts, I've heard this called "Request Merging".

answered Oct 08 '22 by caf


Sounds a little like Lazy Evaluation, but not exactly...

answered Oct 08 '22 by epatel