Implement batch processing in mosaic cache invalidation #77
Conversation
This commit introduces a mechanism to handle large sets of mosaic cache keys by processing them in batches. When the set size reaches a predefined maximum (MAX_SET_SIZE), the current batch is processed, and the set is cleared to make room for new entries. This change ensures that the cache invalidation job can handle large volumes of data without running into memory constraints. Additionally, the commit includes a final call to process any remaining keys after iterating through all mosaic tiles. This improvement optimizes the cache invalidation process, making it more efficient and reliable.
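The following is a minimal sketch of the batching approach described above, not the project's actual code: the names MosaicCacheInvalidator, cacheKeys, and processBatch are illustrative assumptions, and only MAX_SET_SIZE and its value of 1,000,000 come from the PR itself.

```java
// Illustrative sketch only; class and method names are hypothetical.
import java.util.HashSet;
import java.util.Set;

public class MosaicCacheInvalidator {

    // Upper bound on accumulated keys before a batch is flushed (value from the PR summary).
    private static final int MAX_SET_SIZE = 1_000_000;

    private final Set<String> cacheKeys = new HashSet<>();

    public void invalidate(Iterable<String> mosaicTileKeys) {
        for (String key : mosaicTileKeys) {
            cacheKeys.add(key);
            // When the set reaches the predefined maximum, process the current
            // batch and clear the set to make room for new entries.
            if (cacheKeys.size() >= MAX_SET_SIZE) {
                processBatch(cacheKeys);
                cacheKeys.clear();
            }
        }
        // Final call to process any keys remaining after iterating all tiles.
        if (!cacheKeys.isEmpty()) {
            processBatch(cacheKeys);
            cacheKeys.clear();
        }
    }

    private void processBatch(Set<String> keys) {
        // Placeholder for the actual cache invalidation call.
        System.out.println("Invalidating " + keys.size() + " cache keys");
    }
}
```

Keeping the set bounded this way trades a single large invalidation call for several smaller ones, which is what keeps memory usage flat for very large mosaics.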
Walkthrough: The update introduces batch processing for large sets of mosaic cache keys, improving cache invalidation. A new MAX_SET_SIZE constant caps how many keys are held in memory before the current batch is processed and the set is cleared.
Move the condition for updating mosaic cache info into the main invalidation function block to ensure it only runs after invalidations are processed. This change improves the logical grouping of operations and maintains the correct sequence of cache invalidation followed by cache info update.
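A rough sketch of that reordering, assuming hypothetical method names (invalidateTiles, updateMosaicCacheInfo) rather than the project's real API:

```java
// Illustrative sketch only; names are placeholders for the actual methods.
import java.util.List;

public class MosaicCacheJob {

    public void invalidateMosaicCache(List<String> mosaicIds) {
        for (String mosaicId : mosaicIds) {
            invalidateTiles(mosaicId);
        }
        // The cache info update now sits inside the main invalidation block,
        // so it runs only after all invalidations have been processed.
        updateMosaicCacheInfo(mosaicIds);
    }

    private void invalidateTiles(String mosaicId) {
        // Placeholder for per-mosaic tile cache invalidation.
    }

    private void updateMosaicCacheInfo(List<String> mosaicIds) {
        // Placeholder for refreshing cache metadata after invalidation.
    }
}
```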
https://kontur.fibery.io/Tasks/Task/prod-raster-tiler-error-in-invalidateMosaicCache-15257
Summary by CodeRabbit: Introduced a MAX_SET_SIZE constant with a value of 1,000,000 so that mosaic cache keys are processed in batches during cache invalidation.