[timeseries] Fix Time Series Query Correctness Issue #14251
base: master
Conversation
Codecov Report
Attention: Patch coverage is
Additional details and impacted files
@@ Coverage Diff @@
## master #14251 +/- ##
============================================
+ Coverage 61.75% 63.76% +2.00%
- Complexity 207 1536 +1329
============================================
Files 2436 2626 +190
Lines 133233 144658 +11425
Branches 20636 22139 +1503
============================================
+ Hits 82274 92234 +9960
- Misses 44911 45621 +710
- Partials 6048 6803 +755
Flags with carried forward coverage won't be shown.
Force-pushed from c1acbff to e581709
}
int[] timeValueIndexes = getTimeValueIndex(timeValues, _storedTimeUnit);
Object[][] tagValues = new Object[_groupByExpressions.size()][];
ValueBlock valueBlock;
Map<Long, BaseTimeSeriesBuilder> seriesBuilderMap = new HashMap<>(1024);
Should we increase the 1k limit of the series builder map?
This is the initial capacity, not a limit; the map may grow beyond it based on the load factor (0.75 by default).
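To illustrate the point above: a `HashMap`'s initial capacity is only a sizing hint, and the map resizes itself once its size exceeds capacity × load factor. A minimal sketch (the value type `String` here is a stand-in for `BaseTimeSeriesBuilder`):

```java
import java.util.HashMap;
import java.util.Map;

public class InitialCapacityDemo {
    public static void main(String[] args) {
        // 1024 is only the initial capacity; the map rehashes and grows
        // whenever size > capacity * loadFactor (0.75 by default), so it
        // can hold far more than 1024 entries without any explicit limit.
        Map<Long, String> seriesBuilderMap = new HashMap<>(1024);
        for (long i = 0; i < 10_000; i++) {
            seriesBuilderMap.put(i, "series-" + i);
        }
        System.out.println(seriesBuilderMap.size()); // prints 10000
    }
}
```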
While testing the time series engine in our cluster, we found an undercount issue. This is likely because the AggregationOperator was not pulling all the docs matched by the filter. This PR fixes that by calling nextBlock in a loop.

Test Plan: Updated the existing UT so that it was able to detect this bug. To do this, the following was done:
sum(orderAmount) was computed over the entire data, with a single time bucket, and the corresponding result was verified with ClickHouse (CH).

CH query for loading and testing the new sample data:
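The fix described above (draining the upstream operator by calling nextBlock in a loop rather than aggregating only the first block) can be sketched as follows. This is a hypothetical stand-in, not Pinot's actual operator API: `SketchOperator`, `nextBlock`, and `aggregateAll` are illustrative names only.

```java
import java.util.List;

// Hypothetical minimal stand-in for an upstream operator that yields
// value blocks one at a time, returning null when exhausted.
class SketchOperator {
    private final List<int[]> _blocks;
    private int _cursor = 0;

    SketchOperator(List<int[]> blocks) {
        _blocks = blocks;
    }

    int[] nextBlock() {
        return _cursor < _blocks.size() ? _blocks.get(_cursor++) : null;
    }
}

public class DrainOperatorDemo {
    // The essence of the fix: keep calling nextBlock() until the operator
    // is exhausted. Aggregating only the first block would undercount
    // whenever the filter matches more docs than one block holds.
    static long aggregateAll(SketchOperator op) {
        long sum = 0;
        int[] block;
        while ((block = op.nextBlock()) != null) {
            for (int v : block) {
                sum += v;
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        SketchOperator op = new SketchOperator(
            List.of(new int[]{1, 2}, new int[]{3, 4}));
        // All four values are summed; stopping after the first block
        // would have produced 3 instead of 10.
        System.out.println(aggregateAll(op)); // prints 10
    }
}
```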