I know how to do memoization in Python easily, but I need to compute these values faster, so I am using C++. However, I have no idea how to memoize there. I understand that it's about storing results in an array or map and looking them up before recomputing, but it would be really helpful to see how this is done so I can test its speed.
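The pattern is the same as in Python: keep a cache keyed on the arguments and consult it before computing. As a minimal sketch, here is a function-local cache (the fib example is purely illustrative):

#include <cstdint>
#include <iostream>
#include <unordered_map>

// Minimal memoization sketch: a function-local map caches results,
// keyed on the argument.
std::uint64_t fib(int n)
{
    static std::unordered_map<int, std::uint64_t> cache;
    auto it = cache.find(n);
    if (it != cache.end())
        return it->second;                        // cache hit: no recomputation
    std::uint64_t result = n < 2 ? n : fib(n - 1) + fib(n - 2);
    cache.emplace(n, result);                     // cache miss: compute and store
    return result;
}

int main()
{
    std::cout << fib(90) << '\n';   // fast even for large n
}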
Just for fun, here's a little generic memoizer I wrote some time ago. It requires variadic templates, naturally:
#include <functional>
#include <iostream>
#include <map>
#include <tuple>
#include <unordered_map>

template <template <typename...> class Container, typename...> struct Memo;

template <typename R, typename... Args, template <typename...> class Container>
struct Memo<Container, R, std::tuple<Args...>>
{
    Memo(std::function<R(Args...)> f) : func(f) { }

    // Arguments are taken by value: Args... are fixed by the class template,
    // so a parameter list of "Args && ..." would declare rvalue references,
    // not forwarding references, and calls with lvalue arguments would fail.
    R operator()(Args... args)
    {
        const auto arg = std::make_tuple(args...);
        typename CacheContainer::const_iterator it = cache.find(arg);
        if (it == cache.cend())
        {
            it = cache.insert(typename CacheContainer::value_type(arg, func(args...))).first;
            std::cout << "New call, memoizing..." << std::endl;
        }
        else
        {
            std::cout << "Found it in the cache!" << std::endl;
        }
        return it->second;
    }

private:
    typedef Container<std::tuple<Args...>, R> CacheContainer;
    std::function<R(Args...)> func;
    CacheContainer cache;
};
template <typename R, typename... Args>
Memo<std::map, R, std::tuple<Args...>> OMapMemoize(R(&f)(Args...))
{
    return Memo<std::map, R, std::tuple<Args...>>(f);
}

template <typename R, typename... Args>
Memo<std::unordered_map, R, std::tuple<Args...>> UMapMemoize(R(&f)(Args...))
{
    return Memo<std::unordered_map, R, std::tuple<Args...>>(f);
}
Note that the call operator takes its arguments by value: since Args... are fixed by the class template, Args&&... would not be forwarding references there. Anyway, here's a test case:
int foo(double, char) { return 5; }

int main()
{
    auto f = OMapMemoize(foo);   // ordered-map-backed cache
    auto g = UMapMemoize(foo);   // hash-map-backed cache

    int a, b;
    a = f(1.0, 'a');   // first call: computed and memoized
    a = f(1.0, 'a');   // repeat calls: served from the cache
    a = f(1.0, 'a');
    a = f(1.0, 'a');
    b = g(1.0, 'a');
    b = g(1.0, 'a');
    b = g(1.0, 'a');
    b = g(1.0, 'a');
    return a;
}
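For each memoizer, the first call prints "New call, memoizing..." and the three repeat calls print "Found it in the cache!", since f and g each keep their own cache.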