hephex / asyncache
Helpers to use cachetools with async functions
License: MIT License
Coroutine is cached instead of its result
Traceback (most recent call last):
File "runner.py", line 4, in <module>
asyncio.run(test.main())
File "/usr/lib/python3.8/asyncio/runners.py", line 43, in run
return loop.run_until_complete(main)
File "/usr/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
return future.result()
File "test.py", line 11, in main
await test(0)
RuntimeError: cannot reuse already awaited coroutine
The test files are in the attachments; reproduce with:
python setup.py build_ext --inplace
python runner.py
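The failure mode is easy to reproduce even without Cython: any synchronous memoization decorator applied to an async function caches the coroutine object rather than its result. A minimal stdlib-only sketch (using functools.lru_cache here in place of the cachetools decorators, purely to illustrate the mechanism):

```python
import asyncio
from functools import lru_cache

# A sync memoization decorator caches whatever the function call returns;
# for an `async def`, that is the coroutine object itself, not its result.
@lru_cache(maxsize=None)
async def get_value(x):
    return x * 2

async def main():
    await get_value(0)      # first call: the cached coroutine is awaited
    try:
        await get_value(0)  # same coroutine object comes back from the cache
    except RuntimeError as exc:
        print(exc)          # cannot reuse already awaited coroutine

asyncio.run(main())
```

An async-aware decorator instead has to await the coroutine and cache the awaited value.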
I assume the dependency on mypy was accidental (and possibly a side effect of moving to Poetry). Prior to the recent updates (before 0.2.0), the only dependency was cachetools (which is expected/reasonable).
I only noticed this because I was recreating my environment, where my current mypy (0.990) conflicts with your stated dependency constraint of ^0.982. I'm not a Poetry user, but my quick reading (and the actual error) both indicate that the constraint effectively locks mypy to that specific version (since mypy doesn't use 'patch' releases):
ERROR: Cannot install asyncache==0.3.0 and mypy==0.990 because these package versions have conflicting dependencies.
The conflict is caused by:
The user requested mypy==0.990
asyncache 0.3.0 depends on mypy<0.983 and >=0.982
It would be great if that dependency were removed (or, if it's needed for some non-obvious reason, if the constraint were relaxed).
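For reference, with Poetry the usual fix is to declare mypy in a development-only dependency group, so it never appears in the published package's requirements. A sketch (section names follow Poetry's pyproject.toml conventions; the version numbers here are illustrative):

```toml
[tool.poetry.dependencies]
python = "^3.8"
cachetools = "^5.0"      # the only runtime dependency

[tool.poetry.group.dev.dependencies]
mypy = ">=0.982"         # dev tooling only; not installed by consumers
```

Note that with 0.x versions, a caret constraint like ^0.982 is treated as >=0.982,<0.983, which is why the original constraint effectively pins mypy.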
Hello!
I've found a small discrepancy between the behavior of cachetools.cachedmethod and asyncache.cachedmethod with regard to the default key function. Perhaps it makes sense to use keys.methodkey instead of keys.hashkey there, to exclude self from the cache keys?
This would remove the requirement that the class instance be hashable, and overall it just looks like a better way of caching instance method calls.
I don't have access to create PRs myself, so here is the change snippet:
--- a/asyncache/__init__.py
+++ b/asyncache/__init__.py
@@ -117,7 +117,7 @@ def cachedmethod(
cache: Callable[[Any], Optional[MutableMapping[_KT, Any]]],
# ignoring the mypy error to be consistent with the type used
# in https://github.com/python/typeshed/tree/master/stubs/cachetools
- key: Callable[..., _KT] = keys.hashkey, # type:ignore
+ key: Callable[..., _KT] = keys.methodkey, # type:ignore
lock: Optional[Callable[[Any], "AbstractContextManager[Any]"]] = None,
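The difference is easy to see directly on the key functions: keys.methodkey simply drops its first argument (self) before building the key. A small sketch (assuming a cachetools version that ships keys.methodkey):

```python
from cachetools import keys

class Unhashable:
    __hash__ = None  # instances of this class cannot be hashed

obj = Unhashable()
cache = {}

# methodkey ignores `self`, so the key works even for unhashable instances:
cache[keys.methodkey(obj, 42)] = "result"

# hashkey includes the instance, so using the key in a dict fails:
try:
    cache[keys.hashkey(obj, 42)] = "result"
except TypeError:
    print("hashkey requires the instance itself to be hashable")
```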
Is there a way to clear the cache when using @cached from this package, like cache_clear() in cachetools?
Hi, reading your code I saw that you have the lock typed as AbstractContextManager:
def cached(
cache: Optional[MutableMapping[_KT, Any]],
# ignoring the mypy error to be consistent with the type used
# in https://github.com/python/typeshed/tree/master/stubs/cachetools
key: Callable[..., _KT] = keys.hashkey, # type:ignore
lock: Optional["AbstractContextManager[Any]"] = None,
) -> IdentityFunction:
I think it should be AbstractAsyncContextManager, because we can't do async with lock on something that is not an async context manager.
Unlike async-lru, this library does not make cache accesses await a pending future (compare https://github.com/aio-libs/async-lru/blob/ae252508f9c5aecf9c02ddeb879d06c28dbffc42/async_lru.py#L150).
This means that when I call my function multiple times concurrently, subsequent runs do not await the prior future.
The issue seems to be here, where the future should be inserted into the dict before awaiting:
asyncache/asyncache/__init__.py, lines 57 to 61 in 27ec0ad
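The effect can be sketched without asyncache: if the result is stored only after the await, two concurrent callers both miss the cache and both execute the body. A stdlib-only illustration of the race:

```python
import asyncio

calls = 0
cache = {}

async def cached_slow(x):
    """Stores the result only AFTER awaiting -- subject to the race."""
    global calls
    if x in cache:
        return cache[x]
    calls += 1
    await asyncio.sleep(0.01)  # both tasks suspend here on a cache miss
    cache[x] = x * 2           # the cache is written too late
    return cache[x]

async def main():
    await asyncio.gather(cached_slow(1), cached_slow(1))
    print(calls)  # 2 -- the body ran twice for the same key

asyncio.run(main())
```

Inserting an asyncio.Future into the mapping before the first await, and having later callers await that future, is the approach async-lru takes.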
Please add to the README that this has been published to PyPI (https://pypi.org/project/asyncache/), and that it can be installed with pip install asyncache.
Would you like to add type hints to asyncache? I could help
I find myself using cachetools and async code more and more, and I had to write my own fork of cachetools to get the behavior I need in my project (see tkem/cachetools#234).
As pointed out, asyncache works great with cachetools, which I'd like to use instead of my own fork. One piece of functionality we rely on, however, is cachetools' @cachedmethod decorator (essentially, grabbing the cache instance from the decorated method's class instead of using a global cache).
Would a PR adding a @cachedmethod decorator be welcome? Plus tests, obviously. If it would, I am happy to work on it (most of the code already exists on my drive).
Thanks in any case!
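For context, this is the (synchronous) cachetools pattern being requested: the decorator resolves the cache through a callable applied to self, so each instance can own its cache. A sketch, assuming a recent cachetools:

```python
import operator
from cachetools import LRUCache, cachedmethod

class PEPClient:
    def __init__(self):
        self._cache = LRUCache(maxsize=32)  # per-instance cache

    # the attrgetter tells the decorator where to find this instance's cache
    @cachedmethod(operator.attrgetter("_cache"))
    def get_pep(self, num):
        return {"num": num}

client = PEPClient()
client.get_pep(8)
client.get_pep(8)          # second call is served from client._cache
print(len(client._cache))  # 1
```

An async variant would keep this cache-lookup callable but await the wrapped coroutine before storing its result.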
How is one supposed to safely evict entries?
I'm going off of the cachetools documentation here, which suggests that to evict an entry I would need to do something like this:
# always use the key function for accessing cache items
with get_pep.cache_lock:
get_pep.cache.pop(get_pep.cache_key(42), None)
However, when using asyncache I don't appear to see any of the usual attributes like cache_lock, cache_clear, cache_info, etc.
How does one get access to those in the async wrapper?
ps. Thanks for the great library.
In cachetools we can get the cache from a function decorated with @cached, e.g. in order to clear the cache:
@cached
def some_func():
pass
some_func.cache.clear()
@cached from asyncache doesn't yet seem to expose the cache, or am I overlooking something?
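Until the wrapper exposes the cache, the pattern can be built by hand. A minimal stdlib-only sketch of an async-aware cached decorator (not asyncache's actual implementation) that awaits the result before storing it and attaches the mapping to the wrapper:

```python
import asyncio
from functools import wraps

def async_cached(cache):
    """Minimal sketch: cache the awaited result and expose the mapping."""
    def decorator(func):
        @wraps(func)
        async def wrapper(*args):
            if args in cache:
                return cache[args]
            result = await func(*args)  # await first, then store the result
            cache[args] = result
            return result
        wrapper.cache = cache           # explicit handle, like cachetools
        return wrapper
    return decorator

@async_cached({})
async def get_pep(num):
    return {"num": num}

async def main():
    await get_pep(8)
    get_pep.cache.clear()  # the cache_clear() equivalent

asyncio.run(main())
```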
Hi,
could you please add a py.typed file to the package? Otherwise mypy doesn't know that it should use the package's type annotations.
https://mypy.readthedocs.io/en/stable/installed_packages.html#creating-pep-561-compatible-packages
Cheers.
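For reference, PEP 561 compliance only requires shipping an empty asyncache/py.typed marker file inside the package. With Poetry, files inside the package directory are normally packaged automatically, but an explicit include makes it robust (a sketch; the path assumes the standard layout):

```toml
# pyproject.toml -- ensure the empty marker file asyncache/py.typed ships
[tool.poetry]
include = ["asyncache/py.typed"]
```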
It looks like you've added a py.typed file here on GitHub, but haven't released a new version incorporating the change. Could you cut a new release and upload it to PyPI?