I'm using @functools.lru_cache
in Python 3.3. I would like to save the cache to a file, in order to restore it when the program is restarted. How can I do that?
Edit 1: A possible solution would require pickling any sort of callable, but I hit a problem pickling __closure__:
_pickle.PicklingError: Can't pickle <class 'cell'>: attribute lookup builtins.cell failed
If I try to restore the function without it, I get:
TypeError: arg 5 (closure) must be tuple
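A minimal sketch of the problem described above: closures capture their variables in cell objects, which the stdlib pickle module cannot serialize (the exact exception type varies by Python version; the function and variable names here are my own illustration, not from the question):

```python
import pickle

def make_adder(n):
    def add(x):
        return x + n  # n is stored in a closure cell on add.__closure__
    return add

adder = make_adder(5)

try:
    pickle.dumps(adder.__closure__)
    print("pickled")
except Exception as exc:  # PicklingError in 3.3, TypeError in newer versions
    print("failed:", type(exc).__name__)
```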
Python's functools module comes with the @lru_cache decorator, which caches a function's results using the Least Recently Used (LRU) eviction strategy. Every time you call the decorated function, lru_cache checks whether a result is already cached for the given arguments; if so, it returns the cached value instead of recomputing it.
The functools module, part of Python's standard library, provides utilities for working with higher-order functions (functions that take or return other functions). Caching itself is an important concept for every Python programmer: it means storing data in a temporary location so it can be reused, instead of being retrieved or recomputed from the source each time.
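As a quick illustration of the decorator described above (a minimal sketch):

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # unbounded cache; pass a number to cap its size
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))           # 832040 -- later calls with the same argument hit the cache
print(fib.cache_info())  # hit/miss statistics, but no access to the cached values themselves
```

Note that cache_info() only exposes statistics, which is exactly why the answer below says the cache contents cannot be saved through this API.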
You can't do what you want using lru_cache, since it doesn't provide an API to access the cache, and it might be rewritten in C in future releases. If you really want to save the cache, you have to use a different solution that gives you access to it.
It's simple enough to write a cache yourself. For example:
from functools import wraps

def cached(func):
    func.cache = {}
    @wraps(func)
    def wrapper(*args):
        try:
            return func.cache[args]
        except KeyError:
            func.cache[args] = result = func(*args)
            return result
    return wrapper
You can then apply it as a decorator:
>>> @cached
... def fibonacci(n):
...     if n < 2:
...         return n
...     return fibonacci(n-1) + fibonacci(n-2)
...
>>> fibonacci(100)
354224848179261915075
And retrieve the cache:

>>> fibonacci.cache
{(32,): 2178309, (23,): 28657, ... }
You can then pickle/unpickle the cache as you please and load it with:
fibonacci.cache = pickle.load(cache_file_object)
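Putting the pieces together, here is a sketch of persisting the cache across runs (the file name and the save/load helper names are my own assumptions, not part of the answer):

```python
import os
import pickle
from functools import wraps

def cached(func):
    func.cache = {}
    @wraps(func)
    def wrapper(*args):
        try:
            return func.cache[args]
        except KeyError:
            func.cache[args] = result = func(*args)
            return result
    return wrapper

@cached
def fibonacci(n):
    return n if n < 2 else fibonacci(n - 1) + fibonacci(n - 2)

CACHE_FILE = "fibonacci_cache.pickle"  # hypothetical file name

def load_cache(path=CACHE_FILE):
    # Restore the cache from a previous run, if the file exists.
    if os.path.exists(path):
        with open(path, "rb") as f:
            fibonacci.cache = pickle.load(f)

def save_cache(path=CACHE_FILE):
    # Dump the cache dict so the next run can reuse it.
    with open(path, "wb") as f:
        pickle.dump(fibonacci.cache, f)

load_cache()
print(fibonacci(100))
save_cache()
```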
I found a feature request on Python's issue tracker to add dumps/loads to lru_cache, but it wasn't accepted/implemented. Maybe in the future it will be possible to have built-in support for these operations via lru_cache.
You can use a library of mine, mezmorize:
import random
from mezmorize import Cache

cache = Cache(CACHE_TYPE='filesystem', CACHE_DIR='cache')

@cache.memoize()
def add(a, b):
    return a + b + random.randrange(0, 1000)

>>> add(2, 5)
727
>>> add(2, 5)
727