I want to cache a function that takes a list as a parameter, but when I try to do so with the functools.lru_cache decorator, it fails with TypeError: unhashable type: 'list':
import functools

@functools.lru_cache()
def example_func(lst):
    return sum(lst) + max(lst) + min(lst)

print(example_func([1, 2]))  # raises TypeError: unhashable type: 'list'
This fails because a list is unhashable, so lru_cache cannot use it as a key to look up cached results. One way to fix this is to convert lists to tuples before passing them to a cached function: since tuples are immutable and hashable, they can serve as cache keys.
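For example, the function from the question works unchanged if the caller converts the list to a tuple first (a minimal sketch reusing the question's example_func):

import functools

@functools.lru_cache()
def example_func(lst):
    return sum(lst) + max(lst) + min(lst)

print(example_func(tuple([1, 2])))  # 6; (1, 2) is hashable, so the result is cached
print(example_func((1, 2)))         # 6, served straight from the cache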
Use a tuple instead of a list:
>>> from functools import lru_cache
>>> @lru_cache(maxsize=2)
... def my_function(args):
...     pass
...
>>> my_function([1,2,3])
Traceback (most recent call last):
  File "<input>", line 1, in <module>
    my_function([1,2,3])
TypeError: unhashable type: 'list'
>>> # TO FIX: use a tuple
>>> my_function(tuple([1,2,3]))
>>>
Ideally, it should not throw an error at all; instead, the arguments should be converted into a hashable form inside a decorator, without the caller even noticing. You can fix this problem by decorating your functions like this:
from functools import lru_cache

# Custom decorator: convert list arguments to tuples so lru_cache can hash them
def listToTuple(function):
    def wrapper(*args):
        args = [tuple(x) if type(x) == list else x for x in args]
        result = function(*args)
        result = tuple(result) if type(result) == list else result
        return result
    return wrapper

# your cached function (cacheMaxSize is defined elsewhere in the answerer's code)
@listToTuple
@lru_cache(maxsize=cacheMaxSize)
def checkIfAdminAcquired(self, adminId) -> list:
    query = ("SELECT id FROM public.admins WHERE id IN ({}) "
             "and confirmed_at IS NOT NULL")
    response = self.handleQuery(query, "int", adminId)
    return response
You might want to stack yet another decorator on top of lru_cache to make sure that the output of the function is a list rather than a tuple, since as written it will return a tuple; see the sketch below.
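A minimal sketch of such a decorator, placed outermost so it converts the tuple back to a list after listToTuple and lru_cache have run (the name tupleToList and the stacking shown are illustrative, not from the original answer):

def tupleToList(function):
    def wrapper(*args):
        result = function(*args)
        # convert a tuple result back into the list the caller expects
        return list(result) if isinstance(result, tuple) else result
    return wrapper

@tupleToList
@listToTuple
@lru_cache(maxsize=cacheMaxSize)
def checkIfAdminAcquired(self, adminId) -> list:
    ...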
Sometimes a parameter can take either a simple hashable type or a complicated unhashable type with no straightforward conversion to a hashable form, as the other answers propose. In this situation it may still be desirable to use the cache for the (possibly more common) hashable case, and in the unhashable case to skip the cache rather than raise an error, simply calling the underlying function.
This ignores the error and works generally for any hashable type:
import functools

def ignore_unhashable(func):
    uncached = func.__wrapped__
    attributes = functools.WRAPPER_ASSIGNMENTS + ('cache_info', 'cache_clear')
    @functools.wraps(func, assigned=attributes)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except TypeError as error:
            if 'unhashable type' in str(error):
                return uncached(*args, **kwargs)
            raise
    wrapper.__uncached__ = uncached
    return wrapper
Usage and testing:
@ignore_unhashable
@functools.lru_cache()
def example_func(lst):
    return sum(lst) + max(lst) + min(lst)
example_func([1, 2]) # 6
example_func.cache_info()
# CacheInfo(hits=0, misses=0, maxsize=128, currsize=0)
example_func((1, 2)) # 6
example_func.cache_info()
# CacheInfo(hits=0, misses=1, maxsize=128, currsize=1)
example_func((1, 2)) # 6
example_func.cache_info()
# CacheInfo(hits=1, misses=1, maxsize=128, currsize=1)
It took me a moment to wrap my head around it, but example_func.__wrapped__ is the lru_cache-wrapped version and example_func.__uncached__ is the original, uncached version.
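To make the distinction concrete, here is a small illustrative check using the definitions above (the calls and comments are mine, not part of the original answer):

example_func.__wrapped__((1, 2))   # the lru_cache-wrapped function: this call is tracked by cache_info()
example_func.__uncached__([1, 2])  # the plain original function: bypasses the cache, so a list is fine here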