
cachetools memoizing attributes (cache, key, lock) are not available #13395

Open
lucaswerkmeister opened this issue Jan 12, 2025 · 2 comments
Labels: stubs: false positive (Type checkers report false errors)

@lucaswerkmeister

According to cachetools’ documentation,

The decorator’s cache, key and lock parameters are also available as cache, cache_key and cache_lock attributes of the memoizing wrapper function. […] For the common use case of clearing or invalidating the cache, the decorator also provides a cache_clear()[.]

However, this is currently not reflected in the typeshed stubs. cached() and cachedmethod() are declared to return IdentityFunction, i.e. a decorator that takes the function-to-be-cached and returns a callable with an identical signature, but without the additional attributes and methods. It would be great if support for this were added.

Code example
import cachetools
import threading

## cached function

@cachetools.cached(cache=cachetools.LRUCache(maxsize=10),
                   key=lambda x, y: y,
                   lock=threading.Lock())
def fun(x: str, y: str) -> str:
    return y

print(fun('ignored', 'hello world'))
with fun.cache_lock:
    print(fun.cache[fun.cache_key('ignored', 'hello world')])
fun.cache_clear()

## cached method

class Class:
    def __init__(self):
        self.cache = cachetools.LRUCache(maxsize=10)
        self.lock = threading.Lock()

    @cachetools.cachedmethod(cache=lambda self: self.cache,
                             key=lambda self, x, y: y,
                             lock=lambda self: self.lock)
    def meth(self, x: str, y: str) -> str:
        return y

c = Class()
print(c.meth('ignored', 'hello world'))
with c.meth.cache_lock(c):
    print(c.meth.cache(c)[c.meth.cache_key(c, 'ignored', 'hello world')])

I’m not sure if this is the “intended” way to use the memoizing attributes of a cached method (passing in c for the self argument explicitly), but it seems to work at runtime, at least. (Personally, I’d be fine with support just for @cached functions, but I figured it makes sense to include @cachedmethod in the example too.)

@lucaswerkmeister (Author)

I tried to implement this myself, but I’m afraid I’m not good enough at Python generics and type magic, sorry 😔 but maybe someone else can figure it out!

@srittau srittau added the stubs: false positive Type checkers report false errors label Jan 13, 2025
@max-muoto (Contributor) commented Jan 27, 2025

I think the core reasons why this is difficult to do (at least for cachedmethod) are the same as why lru_cache in the stdlib isn't properly typed: #11280 and https://discuss.python.org/t/allow-self-

I think fixing cachetools.cached wouldn't be too hard; something like this would work for the example you provided:

from contextlib import AbstractContextManager
from typing import Any, Callable, Generic, Mapping, MutableMapping, ParamSpec, TypeVar

_P = ParamSpec("_P")
_T = TypeVar("_T")
_KT = TypeVar("_KT")

class _CachedFunc(Generic[_P, _T]):
    def __call__(self, *args: _P.args, **kwargs: _P.kwargs) -> _T: ...
    def cache_clear(self) -> None: ...
    # key functions return the cache key, not the cached value
    def cache_key(self, *args: _P.args, **kwargs: _P.kwargs) -> Any: ...
    @property
    def cache_lock(self) -> AbstractContextManager[Any]: ...
    cache: Mapping[Any, Any]

def cached(
    cache: MutableMapping[_KT, Any] | None,
    key: Callable[..., _KT] = ...,
    lock: AbstractContextManager[Any] | None = None,
    info: bool = False,
) -> Callable[[Callable[_P, _T]], _CachedFunc[_P, _T]]: ...

(Ignoring the cache type)
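For illustration, here is a minimal runnable sketch of the same ParamSpec-based approach, using a toy memoizing decorator rather than cachetools itself (toy_cached and this _CachedFunc implementation are hypothetical, not cachetools' actual code; only the attribute names mirror its documented API):

```python
from typing import Any, Callable, Generic, MutableMapping, TypeVar

try:
    from typing import ParamSpec  # Python 3.10+
except ImportError:
    from typing_extensions import ParamSpec

_P = ParamSpec("_P")
_T = TypeVar("_T")

class _CachedFunc(Generic[_P, _T]):
    """Wrapper that preserves the wrapped signature via ParamSpec while
    exposing cache / cache_key / cache_clear as typed attributes."""

    def __init__(
        self,
        func: Callable[_P, _T],
        cache: MutableMapping[Any, _T],
        key: Callable[..., Any],
    ) -> None:
        self._func = func
        self.cache = cache
        self.cache_key = key

    def __call__(self, *args: _P.args, **kwargs: _P.kwargs) -> _T:
        # compute the key, fill the cache on a miss, return the cached value
        k = self.cache_key(*args, **kwargs)
        if k not in self.cache:
            self.cache[k] = self._func(*args, **kwargs)
        return self.cache[k]

    def cache_clear(self) -> None:
        self.cache.clear()

def toy_cached(
    cache: MutableMapping[Any, Any],
    key: Callable[..., Any],
) -> Callable[[Callable[_P, _T]], "_CachedFunc[_P, _T]"]:
    # hypothetical stand-in for cachetools.cached, for typing demonstration only
    def decorator(func: Callable[_P, _T]) -> _CachedFunc[_P, _T]:
        return _CachedFunc(func, cache, key)
    return decorator

@toy_cached(cache={}, key=lambda x, y: y)
def fun(x: str, y: str) -> str:
    return y

print(fun("ignored", "hello world"))                       # hello world
print(fun.cache[fun.cache_key("ignored", "hello world")])  # hello world
fun.cache_clear()
```

With this shape, a type checker sees fun as _CachedFunc[(x: str, y: str), str], so both the call and the attribute accesses from the original example type-check; the open question from the thread remains how to express the same thing for cachedmethod, where self is involved.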

3 participants · No branches or pull requests