fix: Allow model versions to increase in k6 tests #6038
Conversation
const model = generateModel(config.modelType[j], modelName, 1, replicas, config.isSchedulerProxy, config.modelMemoryBytes[j], config.inferBatchSize[j])
const rand = Math.random()
const replicas = Math.floor(rand * config.maxModelReplicas[j]) + 1
const memory = (rand > 0.5) ? replicas + "k" : "1k" // this will induce a change in memory causing a new version of the model to be created
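A minimal sketch of how the randomised memory value could be threaded into the generated model spec so the control plane sees a spec change (and hence a new version). The argument order of generateModel and the use of memory in place of config.modelMemoryBytes[j] are assumptions for illustration, not the exact code in the PR:

```javascript
// Sketch only: feed the randomised memory into the model spec so that,
// roughly half the time, the spec differs and a new model version is created;
// the other half only the replica count changes.
const rand = Math.random()
const replicas = Math.floor(rand * config.maxModelReplicas[j]) + 1
const memory = (rand > 0.5) ? replicas + "k" : "1k"
const model = generateModel(config.modelType[j], modelName, 1, replicas,
    config.isSchedulerProxy, memory, config.inferBatchSize[j]) // memory argument position assumed
```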
It's a bit awkward to also update the memory in a util function called "applyModelReplicaChange", but let's add it like this and I can modify things afterwards.
This is because I'm already modifying both replicas and memory in the control-plane tests, while the replica (and now memory) changes in utils.js were introduced independently. I'll unify those.
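One hypothetical shape for that unification, as a sketch only (the helper name, options object, and availability of config and generateModel inside utils.js are assumptions, not the current API):

```javascript
// Sketch of a single utils.js helper that changes replicas and, optionally,
// memory in one place, so control-plane tests and utils.js stay consistent.
export function applyModelChange(config, j, modelName, { changeMemory = false } = {}) {
    const rand = Math.random()
    const replicas = Math.floor(rand * config.maxModelReplicas[j]) + 1
    // Only perturb memory when the caller explicitly asks for a version change.
    const memory = (changeMemory && rand > 0.5) ? replicas + "k" : config.modelMemoryBytes[j]
    return generateModel(config.modelType[j], modelName, 1, replicas,
        config.isSchedulerProxy, memory, config.inferBatchSize[j])
}
```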
The memory itself is not the target of the test here; I am just trying to induce a version change. If you are changing it later, then perhaps it's fine to get it merged for now.
lgtm, with the comment that I'll update this to be more consistent across scenarios.
More concretely, on the change itself: this will set the memory to "<replicas>k". I assume we'll always use this with small models, so there's no actual issue in terms of scheduling onto servers where the memory hint actually matters.
That's a good point; ideally I should perhaps induce a small change to the memory provided by the user instead.
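A sketch of that alternative, assuming the configured memory is a plain byte count (the helper below is hypothetical and not part of the PR): nudge the user-provided value by a tiny delta, so scheduling behaviour is essentially unchanged while the spec still differs and a new model version is created.

```javascript
// Hypothetical: perturb the user-configured memory by a few bytes instead of
// tying it to the replica count; assumes memoryBytes is a numeric byte count.
function perturbMemory(memoryBytes) {
    const delta = Math.floor(Math.random() * 16) + 1 // 1-16 extra bytes
    return String(Number(memoryBytes) + delta)
}

const memory = (Math.random() > 0.5)
    ? perturbMemory(config.modelMemoryBytes[j]) // induce a version change
    : config.modelMemoryBytes[j]                // leave the spec unchanged
```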
As discussed, we'll merge this as it stands and improve it in a follow-up PR.
What this PR does / why we need it:
This change adds some randomisation so that new versions of a model are rolled out, as opposed to only adding new replicas.