Cache member functions? #86
There's some context in #16. The main complication is that the macro expands to a function definition and a static "once_cell" at the same level, and that static can't exist inside an `impl` block. There would be some other caveats too: the type of `self`/`&self` would have to implement `Hash + Eq + Clone`, or you would always need to specify the `convert` argument to the macro so it doesn't use `self` as part of the cache key.
Would there be a way to tackle the issues you describe with an attribute macro? Like, you annotate your struct "Hey, we gotta get some caching in here" and then generate cache-calling functions for that cache? Consider:

```rust
#[derive(cached::Cacheable)]
// or #[cached::make_cachable], not sure which one you'd need
// generating the static once cell here for Foo
struct Foo(i32);

impl Foo {
    #[cached]
    pub fn get(&self) -> i32 {
        self.0
    }
}
```

to generate a
That's probably doable, with the (similar) caveat that it would only work for structs defined at the top level, so the cache can be defined alongside them. Another option that was touched on in #16 is to allow the `cached` macro to take a callable (probably by string, like the "convert" argument) which returns the cache that should be used, so that you can annotate a method as cached without requiring a global static cache to be defined at the same time.
Hi,

```rust
#[derive(Cacheable)]
struct Foo {
    x: i32,
}

impl Foo {
    #[cached_method]
    pub fn multiply(&self, y: i32) -> i32 {
        self.x * y
    }
}
```

The expanded code would look something like this:

```rust
struct Foo {
    x: i32,
    CACHES: HashMap<String, CacheType>,
}

impl Foo {
    pub fn multiply(&self, y: i32) -> i32 {
        if let Some(cache) = self.CACHES.get("MULTIPLY") {
            if let Some(value) = cache.get((self, y)) {
                return value;
            }
        } else {
            self.CACHES.insert("MULTIPLY", CacheType::new());
        }
        let result = self.x * y;
        self.CACHES.get_mut("MULTIPLY").unwrap().set((self, y), result);
        result
    }
}
```

Caveats:
The issue with the new member and data duplication could be solved with a global/static HashMap. (And yes, I also have a use-case where I want to cache the results of methods, but after writing all this and thinking about it, I will probably make a custom implementation with private cache members, or rewrite my methods to functions which take
Hi.
I don't know why this isn't possible yet, and while searching the issues I did not find anything that answered my question: why is there no caching for member functions?
Consider:
That's actually not that hard to cache, is it? Maybe I'm missing some important pieces here; I don't know.
That being said, one could easily craft a type which just calls (cached) free private functions, like so:
right? That'd yield the same, AFAICS?