When a feature is expensive or slow to compute, you may wish to cache its value. Chalk uses the term "maximum staleness" to describe how recently a feature value must have been computed for it to be returned without re-running a resolver.
https://docs.chalk.ai/docs/feature-caching
Cache feature values rather than computing them in real time.
@features
class User:
    fico_score: int = feature(max_staleness="30d")
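For instance (a minimal sketch: the ChalkClient import path, the User.id primary-key feature, and the literal id value are illustrative assumptions, not part of the snippet above), a query issued within 30 days of the last computation is served from the cache without re-running the resolver:
from chalk.client import ChalkClient

# Served from the cache if fico_score was computed within the last 30 days;
# only after that does the resolver run again.
ChalkClient().query(
    input={User.id: 1},        # User.id is an assumed primary-key feature
    output=[User.fico_score],
)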
https://docs.chalk.ai/docs/feature-caching
Cache the last computed value of the feature indefinitely.
@features
class User:
    # With "infinity", a cached value never goes stale on its own;
    # use a per-request staleness of "0s" (below) to force recomputation.
    fico_score: int = feature(max_staleness="infinity")
https://docs.chalk.ai/docs/feature-caching
Cache intermediate feature values.
ChalkClient().query(
    input={ ... },
    # User.fico_score is not requested in the output...
    output=[User.risk_score],
    # ...but you can specify the staleness anyhow!
    staleness={User.fico_score: "10m"},
)
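To make "intermediate" concrete, here is a minimal sketch of how User.risk_score might be derived from User.fico_score (the risk_score feature and the get_risk_score resolver are assumptions for illustration, not part of the snippet above):
@features
class User:
    fico_score: int = feature(max_staleness="30d")
    risk_score: float

# risk_score reads fico_score, so the per-query staleness above controls
# how fresh the cached fico_score feeding this resolver must be.
@realtime
def get_risk_score(fico: User.fico_score) -> User.risk_score:
    return 1.0 - fico / 850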
https://docs.chalk.ai/docs/query-caching
Set max-staleness per-request.
@features
class User:
    fico_score: int = feature(max_staleness="30d")

ChalkClient().query(
    input={...},
    output=[User.fico_score],
    # Overrides the 30-day default declared on the feature, for this query only.
    staleness={User.fico_score: "10m"},
)
https://docs.chalk.ai/docs/query-caching
Supply a feature value in the input to skip the cache and any resolver entirely.
@features
class User:
    fico_score: int = feature(max_staleness="30d")

ChalkClient().query(
    # The supplied value is used directly: no cache lookup, no resolver run.
    input={User.fico_score: 1, ...},
    output=[...],
)
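As a variant of the same idea (a sketch reusing the assumed User.id and User.risk_score from earlier), the supplied fico_score is consumed directly by any downstream resolvers:
ChalkClient().query(
    # fico_score is taken as given, so its cache and resolver are skipped,
    # while resolvers that depend on it (such as risk_score) still run.
    input={User.id: 1, User.fico_score: 720},
    output=[User.risk_score],
)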
https://docs.chalk.ai/docs/query-caching
Bypass the cache with a max-staleness of 0.
ChalkClient().query(
    input={...},
    output=[User.fico_score],
    # A staleness of "0s" means no cached value is fresh enough,
    # so the resolver always runs.
    staleness={User.fico_score: "0s"},
)
https://docs.chalk.ai/docs/query-caching#cache-busting
Keep the cache warm by scheduling a resolver to run more frequently than the max-staleness.
import requests

@features
class User:
    name: str
    fico_score: int = feature(max_staleness="30d")

# 29d 11h is just under the 30d staleness, so the cache is refreshed before it expires.
@realtime(cron="29d 11h")
def get_fico_score(name: User.name) -> User.fico_score:
    return requests.get("https://experian.com").json()["score"]