
How to dynamically remove a decorator from a function?

I'd like to activate or deactivate a "cache" in some class method during execution.

I found a way to activate it with something like this:

(...)
setattr(self, "_greedy_function", my_cache_decorator(self._cache)(getattr(self, "_greedy_function")))
(...)

where self._cache is a cache object of my own that stores the results of self._greedy_function.

It's working fine, but what if I now want to deactivate the cache and "undecorate" _greedy_function?

I see a possible solution: storing a reference to _greedy_function before decorating it. But maybe there is a way to retrieve the original from the decorated function, which would be cleaner.
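That idea can be sketched minimally like this, assuming the decorator applies functools.wraps (which records the original function as __wrapped__ on the wrapper); my_cache_decorator and Thing here are illustrative stand-ins, not the real code:

```python
import functools

def my_cache_decorator(cache):
    def decorator(func):
        @functools.wraps(func)  # wraps() sets wrapper.__wrapped__ = func
        def wrapper(*args):
            if args not in cache:
                cache[args] = func(*args)
            return cache[args]
        return wrapper
    return decorator

class Thing:
    def _greedy_function(self, x):
        return x * 2

    def set_cache(self, cache):
        # install the wrapped function as an instance attribute
        self._greedy_function = my_cache_decorator(cache)(self._greedy_function)

    def unset_cache(self):
        # recover the original bound method from the wrapper
        self._greedy_function = self._greedy_function.__wrapped__

t = Thing()
t.set_cache({})
t.unset_cache()
print(t._greedy_function(3))  # 6
```

With this approach no extra reference needs to be stored; the wrapper itself carries the original.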

As requested, here are the decorator and the cache object I'm using to cache results of my class functions:

import logging
from collections import OrderedDict, namedtuple
from functools import wraps

logging.basicConfig(
    level=logging.WARNING,
    format='%(asctime)s %(name)s %(levelname)s %(message)s'
)

logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

CacheInfo = namedtuple("CacheInfo", "hits misses maxsize currsize")

def lru_cache(cache):
    """
    A replacement for functools.lru_cache() build on a custom LRU Class.
    It can cache class methods.
    """
    def decorator(func):
        logger.debug("assigning cache %r to function %s" % (cache, func.__name__))
        @wraps(func)
        def wrapped_func(*args, **kwargs):
            try:
                ret = cache[args]
                logger.debug("cached value returned for function %s" % func.__name__)
                return ret
            except KeyError:
                # not cached yet: compute, store, then return
                ret = func(*args, **kwargs)
                logger.debug("cache updated for function %s" % func.__name__)
                cache[args] = ret
                return ret
        return wrapped_func
    return decorator

class LRU(OrderedDict):
    """
    Custom implementation of a LRU cache, build on top of an Ordered dict.
    """
    __slots__ = "_hits", "_misses", "_maxsize"

    def __new__(cls, maxsize=128):
        # maxsize=None disables caching entirely: LRU(None) returns None
        if maxsize is None:
            return None
        return super().__new__(cls, maxsize=maxsize)

    def __init__(self, maxsize=128, *args, **kwargs):
        self.maxsize = maxsize
        self._hits = 0
        self._misses = 0
        super().__init__(*args, **kwargs)

    def __getitem__(self, key):
        try:
            value = super().__getitem__(key)
        except KeyError:
            self._misses += 1
            raise
        else:
            self.move_to_end(key)
            self._hits += 1
            return value

    def __setitem__(self, key, value):
        super().__setitem__(key, value)
        if len(self) > self._maxsize:
            # evict the least recently used entry (works for any key shape)
            oldest = next(iter(self))
            super().__delitem__(oldest)

    def __delitem__(self, key):
        try:
            super().__delitem__((key,))
        except KeyError:
            pass

    def __repr__(self):
        return "<%s object at %s: %s>" % (self.__class__.__name__, hex(id(self)), self.cache_info())

    def cache_info(self):
        return CacheInfo(self._hits, self._misses, self._maxsize, len(self))

    def clear(self):
        super().clear()
        self._hits, self._misses = 0, 0

    @property
    def maxsize(self):
        return self._maxsize

    @maxsize.setter
    def maxsize(self, maxsize):
        if not isinstance(maxsize, int):
            raise TypeError("maxsize must be an integer")
        elif maxsize < 2:
            raise ValueError("maxsize must be >= 2")
        elif maxsize & (maxsize - 1) != 0:
            logger.warning("LRU cache performs best when maxsize is a power of two")
        while maxsize < len(self):
            # shrink the cache: evict least recently used entries
            oldest = next(iter(self))
            super().__delitem__(oldest)
        self._maxsize = maxsize

Edit: I've updated my code using the __wrapped__ attribute suggested in the comments, and it's working fine! The whole thing is here: https://gist.github.com/fbparis/b3ddd5673b603b42c880974b23db7cda (see the kik.set_cache() method).

asked Mar 18 '19 by fbparis



2 Answers

You have made things too complicated. The decorator can be removed simply with del self._greedy_function: set_cache stores the wrapped function as an instance attribute, which shadows the method defined on the class, and deleting it exposes the original method again. There's no need for a __wrapped__ attribute.

Here is a minimal implementation of the set_cache and unset_cache methods:

import time

class LRU(OrderedDict):
    def __init__(self, maxsize=128, *args, **kwargs):
        # ...
        self._cache = dict()
        super().__init__(*args, **kwargs)

    def _greedy_function(self):
        time.sleep(1)
        return time.time()

    def set_cache(self):
        self._greedy_function = lru_cache(self._cache)(getattr(self, "_greedy_function"))

    def unset_cache(self):
        del self._greedy_function

Using your decorator lru_cache, here are the results:

o = LRU()
o.set_cache()
print('First call', o._greedy_function())
print('Second call', o._greedy_function())  # here it prints the cached value
o.unset_cache()
print('Third call', o._greedy_function())   # the cache is no longer used

Outputs

First call 1552966668.735025
Second call 1552966668.735025
Third call 1552966669.7354007
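The mechanism behind del is ordinary instance-attribute shadowing, which a minimal sketch (with an illustrative Demo class, not the code above) makes visible:

```python
class Demo:
    def greet(self):
        return "original"

d = Demo()
d.greet = lambda: "wrapped"  # instance attribute shadows the class method
print(d.greet())             # wrapped
del d.greet                  # removes only the shadow, not the class method
print(d.greet())             # original
```

Attribute lookup checks the instance __dict__ before the class, so deleting the instance attribute transparently restores the undecorated method.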
answered Oct 12 '22 by gdlmx


Modern versions of functools.wraps install the original function as the __wrapped__ attribute on the wrappers they create. (One could also search through __closure__ on the nested functions typically used for this purpose, but other types could be used as well.) It's reasonable to expect any wrapper to follow this convention.

An alternative is to have a permanent wrapper that can be controlled by a flag, so that it can be enabled and disabled without removing and reinstating it. This has the advantage that the wrapper can keep its state (here, the cached values). The flag can be a separate variable (e.g., another attribute on an object bearing the wrapped function, if any) or can be an attribute on the wrapper itself.
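The flag-on-the-wrapper variant can be sketched like this (switchable_cache and the enabled attribute are illustrative names, not an existing API):

```python
import functools

def switchable_cache(func):
    """Cache wrapper that can be turned on and off without being removed."""
    cache = {}

    @functools.wraps(func)
    def wrapper(*args):
        if not wrapper.enabled:
            return func(*args)
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]

    wrapper.enabled = True  # the flag lives on the wrapper itself
    return wrapper

@switchable_cache
def square(x):
    print("computing", x)
    return x * x

square(4)               # computes and caches
square(4)               # served from cache, no "computing" line
square.enabled = False
square(4)               # computes again; the cached values are kept
```

Because the wrapper stays in place, re-enabling the flag brings back all previously cached values instead of starting from an empty cache.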

answered Oct 12 '22 by Davis Herring