 

How to ignore a parameter in functools.lru_cache?

Tags:

python-3.x

This is a skeleton of the function I want to enhance with a cache, because doing an RPC (remote procedure call) involves a TCP connection to another host.

def rpc(rpc_server, rpc_func, arg):
    return rpc_server.do_rpc(rpc_func, arg)

However, the most convenient approach, simply decorating it with:

@functools.lru_cache()

does not work well, because rpc_server objects come and go, and this parameter should be ignored by the cache.

I can write simple memoizing code myself; that is no problem. Actually, I see no other solution.

I am unable to rewrite this function in such a way that the @lru_cache() decorator can be applied and rpc_server is still passed as an argument (i.e. I don't want to make rpc_server a global variable). Is it possible?
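For reference, the hand-rolled memoization mentioned above might look roughly like the following sketch; the rpc_cached name and the wrapper are made up for illustration and are not part of the question:

import functools

def rpc_cached(rpc):
    """Memoize rpc() on everything except the rpc_server argument."""
    cache = {}
    @functools.wraps(rpc)
    def wrapper(rpc_server, rpc_func, arg):
        key = (rpc_func, arg)      # rpc_server deliberately left out of the key
        if key not in cache:
            cache[key] = rpc(rpc_server, rpc_func, arg)
        return cache[key]
    return wrapper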

asked Aug 14 '16 by VPfB

People also ask

What does lru_cache(maxsize=None) do?

Python's @lru_cache decorator accepts a maxsize argument that defines the maximum number of entries before the cache starts evicting old items. By default, maxsize is set to 128. If you set maxsize to None, the cache can grow indefinitely and no entries will ever be evicted.
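For illustration, a minimal sketch of the difference (the function names are made up for this example):

import functools

@functools.lru_cache(maxsize=2)      # keep at most 2 cached results
def bounded(x):
    return x * x

@functools.lru_cache(maxsize=None)   # unbounded cache, nothing is ever evicted
def unbounded(x):
    return x * x

bounded(1); bounded(2); bounded(3)   # three misses; the entry for 1 is evicted
print(bounded.cache_info())          # CacheInfo(hits=0, misses=3, maxsize=2, currsize=2)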

How do you use functools in Python?

The functools module is for higher-order functions: functions that act on or return other functions. It provides tools for using or extending other functions and callable objects without completely rewriting them. Among other things, it provides the partial and partialmethod classes.
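A short example of functools.partial (the power/square names are only for illustration):

import functools

def power(base, exponent):
    return base ** exponent

# partial pre-fills some arguments and returns a new callable
square = functools.partial(power, exponent=2)
print(square(5))   # 25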

What is cached property in Python?

cached_property is a decorator that turns a method of a class into a property whose value is computed once and then cached like a regular instance attribute. The cached value remains available for the lifetime of the instance.
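A minimal sketch, assuming Python 3.8+ where functools.cached_property is available (the Circle class is invented for this example):

import functools

class Circle:
    def __init__(self, radius):
        self.radius = radius

    @functools.cached_property
    def area(self):
        print("computing area")          # printed only once per instance
        return 3.14159 * self.radius ** 2

c = Circle(2)
print(c.area)   # computed and stored on the instance
print(c.area)   # returned from the cached attribute, no recomputation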

What is an LRU cache in Python?

LRU stands for Least Recently Used. An LRU cache is an efficient cache data structure that decides which entry to evict when the cache is full. Its main features: a fixed-size capacity, with items kept in order from most recently used to least recently used.
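The eviction behaviour can be observed with a small maxsize (names are illustrative):

import functools

@functools.lru_cache(maxsize=2)
def fetch(key):
    print("miss:", key)
    return key.upper()

fetch("a"); fetch("b")   # two misses fill the cache
fetch("a")               # hit: "a" becomes the most recently used entry
fetch("c")               # miss: "b", the least recently used entry, is evicted
fetch("b")               # miss again, because "b" was evicted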


1 Answer

I'm posting this just for completeness. Comments are welcome, but please do not vote.


I have found a way to satisfy the conditions from my question. I'm not going to use this code, but it shows how flexible Python is.

import functools

class BlackBox:
    """All BlackBoxes are the same."""
    def __init__(self, contents):
        # TODO: use a weak reference for contents
        self._contents = contents

    @property
    def contents(self):
        return self._contents

    def __eq__(self, other):
        return isinstance(other, type(self))

    def __hash__(self):
        return hash(type(self))

@functools.lru_cache()
def _cached_func(blackbox, real_arg):
    print("called with args:", blackbox.contents, real_arg)
    return real_arg + 1000

def cached_func(ignored_arg, real_arg):
    # ignored means ignored by the cache
    return _cached_func(BlackBox(ignored_arg), real_arg)

cached_func("foo", 1) # cache miss
cached_func("bar", 1) # cache hit

cached_func("bar", 2) # cache miss
cached_func("foo", 2) # cache hit
answered Oct 27 '22 by VPfB