I found an article that contains this code:
#include <functional>
#include <map>
#include <tuple>

template <typename ReturnType, typename... Args>
std::function<ReturnType (Args...)>
memoize(std::function<ReturnType (Args...)> func)
{
    std::map<std::tuple<Args...>, ReturnType> cache;
    return ([=](Args... args) mutable {
        std::tuple<Args...> t(args...);
        if (cache.find(t) == cache.end())
            cache[t] = func(args...);
        return cache[t];
    });
}
Can you explain this please? I can't understand many things here, but the weirdest thing is that cache is local and not static, but maybe I'm wrong and...
For recursive functions, a better way to do memoization in C++ is to mix a Y-combinator in. Your base function needs a modification: instead of calling itself directly, it takes a templated reference to itself (or a std::function with the same signature) as its first argument. We start with a Y-combinator.
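That answer's actual code isn't reproduced here, but the idea looks roughly like this minimal sketch of mine (fib and memo_fib are hypothetical names; a real Y-combinator helper would abstract the knot-tying):

#include <cstdint>
#include <functional>
#include <iostream>
#include <map>

// Open-recursive base function: it calls self instead of calling itself directly.
std::uint64_t fib(std::function<std::uint64_t (std::uint64_t)> const& self, std::uint64_t n)
{
    return n < 2 ? n : self(n - 1) + self(n - 2);
}

int main()
{
    std::map<std::uint64_t, std::uint64_t> cache;
    // The wrapper checks the cache and passes itself back into fib,
    // so every recursive call goes through the memoization layer.
    std::function<std::uint64_t (std::uint64_t)> memo_fib =
        [&](std::uint64_t n) -> std::uint64_t {
            auto it = cache.find(n);
            if (it != cache.end())
                return it->second;
            return cache[n] = fib(memo_fib, n);
        };
    std::cout << memo_fib(90) << "\n"; // fast, because each subproblem is computed only once
}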
Memoization is an optimization technique that makes a program faster by storing the results of expensive function calls. When a function is called, it performs the computation, saves the result in a cache, and returns it; if the same input comes up again, the cached result is returned from memory instead of repeating the work.
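As a quick concrete illustration (my own snippet, assuming the memoize template quoted in the question is in scope), you can count how often the wrapped function actually does work; each distinct input is computed only once:

#include <functional>
#include <iostream>

int calls = 0;

int slow_add(int a, int b)
{
    ++calls;              // counts how often the real computation runs
    return a + b;
}

int main()
{
    auto add = memoize(std::function<int (int, int)>(slow_add));
    std::cout << add(2, 3) << "\n"; // 5, computed and cached
    std::cout << add(2, 3) << "\n"; // 5, served from the cache
    std::cout << calls << "\n";     // 1: slow_add ran only once
}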
This is a simple C++11 implementation of memoization.
The memoize function returns a closure: the returned function carries state beyond what is passed through its arguments (in this case, the cache variable). The [=] in the anonymous function (the lambda) means the returned function captures copies of the local variables it uses, and "mutable" lets it modify those copies. The cache variable is not static; it is captured into the lambda instead, so it is shared only across invocations of that particular returned function. Thus, each call to memoize returns a different function with its own cache. Subsequent calls to a specific closure returned by memoize insert values into, or fetch them from, that closure's cache.
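If the capture rules are the confusing part, here is a tiny standalone snippet of mine showing the same mechanism: a mutable lambda that captures a local by copy keeps its own private copy alive across calls, exactly like cache above:

#include <iostream>

int main()
{
    int n = 0;
    // [=] copies n into the closure; `mutable` lets the closure modify its own copy.
    auto counter = [=]() mutable { return ++n; };
    std::cout << counter() << "\n"; // 1
    std::cout << counter() << "\n"; // 2: the closure's copy persisted between calls
    std::cout << n << "\n";         // 0: the original local variable is untouched
}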
You can think of this as roughly equivalent to the more old-school OOP version:
template <typename ReturnType, typename... Args>
class Memoize
{
    std::function<ReturnType (Args...)> func;          // the function being wrapped
    std::map<std::tuple<Args...>, ReturnType> cache;   // one cache per object
public:
    explicit Memoize(std::function<ReturnType (Args...)> f) : func(f) {}

    ReturnType operator() (Args... args)
    {
        std::tuple<Args...> t(args...);
        if (cache.find(t) == cache.end())
            cache[t] = func(args...);
        return cache[t];
    }
};
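A quick usage sketch of this class (my own addition, assuming a constructor that takes the function to wrap, as above):

Memoize<int, int, int> add_memo([](int a, int b) { return a + b; });
add_memo(2, 3); // computes 5 and stores it in this object's cache
add_memo(2, 3); // returns the cached 5 without recomputing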
The cache is embedded into the lambda itself, and local to it.
Therefore, if you create two lambdas, each will have a cache of its own.
It's a nice way to implement a simple cache: the memory used is released as soon as the lambda goes out of scope, so memory use doesn't grow without bound.
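For example (my own snippet, reusing the memoize template from the question), two separately created closures do not share results:

auto square1 = memoize(std::function<int (int)>([](int x) { return x * x; }));
auto square2 = memoize(std::function<int (int)>([](int x) { return x * x; }));
square1(4); // computed, then stored in square1's cache
square1(4); // served from square1's cache
square2(4); // square2 has its own empty cache, so this is computed again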