 

Caching in-memory with a time limit in Python

Tags: python, caching

I have a function that returns a list, say list_x.

def result(val):
    ...
    return list_x

I am calling result() every minute and storing the list.

def other_func():
    #called every minute
    new_list = result(val)

I would like to store the value of new_list for an hour (in some sort of in-memory cache, maybe?) and then refresh it, i.e. call result() after an hour rather than every minute. I read about functools.lru_cache, but I don't think it will help here. Any ideas?

asked Jun 14 '18 by user2715898



2 Answers

The ttl_cache decorator in cachetools==3.1.0 works a lot like functools.lru_cache, but with a time-to-live.

import cachetools.func

@cachetools.func.ttl_cache(maxsize=128, ttl=10 * 60)
def example_function(key):
    return get_expensively_computed_value(key)


class ExampleClass:
    EXP = 2

    @classmethod
    @cachetools.func.ttl_cache()
    def example_classmethod(cls, i):
        return i * cls.EXP

    @staticmethod
    @cachetools.func.ttl_cache()
    def example_staticmethod(i):
        return i * 3
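If adding cachetools as a dependency isn't desirable, a similar effect is possible with just the stdlib by passing lru_cache an extra time-derived "salt" argument: the cache key changes once per TTL window, so stale entries simply stop being hit. A sketch, where the list-building body stands in for the real computation:

```python
import functools
import time

def ttl_hash(ttl=3600):
    # Changes value once every `ttl` seconds, invalidating the cache key.
    return round(time.time() / ttl)

@functools.lru_cache(maxsize=None)
def _result(val, ttl_hash):
    # Stand-in for the expensive computation that builds list_x.
    return [val] * 3

def result(val):
    # The same `val` within one TTL window reuses the cached list.
    return _result(val, ttl_hash())
```

Note that with this trick expired entries are bypassed rather than evicted, so it's worth setting a bounded maxsize if the arguments vary widely.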
answered Nov 15 '22 by Asclepius

Building a single-element cache with a time-to-live is pretty trivial:

import datetime

_last_result_time = None
_last_result_value = None
def result(val):
    global _last_result_time
    global _last_result_value
    now = datetime.datetime.now()
    if not _last_result_time or now - _last_result_time > datetime.timedelta(hours=1):
        _last_result_value = <expensive computation here>
        _last_result_time = now
    return _last_result_value

If you want to generalize this as a decorator, it's not much harder:

import datetime
import functools

def cache(ttl=datetime.timedelta(hours=1)):
    def wrap(func):
        time, value = None, None
        @functools.wraps(func)
        def wrapped(*args, **kw):
            nonlocal time
            nonlocal value
            now = datetime.datetime.now()
            if not time or now - time > ttl:
                value = func(*args, **kw)
                time = now
            return value
        return wrapped
    return wrap
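Used on a toy function (the body and counter here are illustrative stand-ins), the decorator behaves like this:

```python
import datetime
import functools

def cache(ttl=datetime.timedelta(hours=1)):
    def wrap(func):
        time, value = None, None
        @functools.wraps(func)
        def wrapped(*args, **kw):
            nonlocal time, value
            now = datetime.datetime.now()
            if not time or now - time > ttl:
                value = func(*args, **kw)
                time = now
            return value
        return wrapped
    return wrap

calls = 0  # illustrative: counts real invocations of the wrapped function

@cache()
def result(val):
    global calls
    calls += 1
    return [val] * 3  # stand-in for the expensive computation

first = result(1)
second = result(1)  # within the TTL: served from the cache
```

Because this variant keeps a single cached value, it ignores the arguments entirely; any call within the TTL gets the value computed by the first call.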

If you want it to handle different arguments, storing a time-to-live for each one:

def cache(ttl=datetime.timedelta(hours=1)):
    def wrap(func):
        cache = {}
        @functools.wraps(func)
        def wrapped(*args, **kw):
            now = datetime.datetime.now()
            # see lru_cache for fancier alternatives
            key = tuple(args), frozenset(kw.items()) 
            if key not in cache or now - cache[key][0] > ttl:
                value = func(*args, **kw)
                cache[key] = (now, value)
            return cache[key][1]
        return wrapped
    return wrap

You can of course keep adding features to it: give it a max size and evict by time of storage or by LRU or whatever else you want, expose cache stats as attributes on the decorated function, etc. The implementation of lru_cache in the stdlib should help show you how to do most of the trickier things (since it does almost all of them).

answered Nov 15 '22 by abarnert