
fix: Allow model versions to increase in k6 tests #6038

Merged: 1 commit, Nov 8, 2024

Conversation

@sakoush (Member) commented Nov 8, 2024

What this PR does / why we need it:

This change adds some randomisation so that the test rolls out new versions of a model, as opposed to only adding new replicas.

@sakoush sakoush requested a review from lc525 as a code owner November 8, 2024 15:55
const rand = Math.random()
const replicas = Math.floor(rand * config.maxModelReplicas[j]) + 1
const memory = (rand > 0.5) ? replicas + "k" : "1k" // this will induce a change in memory, causing a new version of the model to be created
const model = generateModel(config.modelType[j], modelName, 1, replicas, config.isSchedulerProxy, config.modelMemoryBytes[j], config.inferBatchSize[j])
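The randomisation in the diff above can be sketched as a self-contained helper. This is only an illustration: the function name `randomReplicaAndMemory` is hypothetical, `maxReplicas` stands in for `config.maxModelReplicas[j]`, and the optional `rand` parameter is added here purely to make the behaviour testable.

```javascript
// Sketch (not the PR's actual code): pick a random replica count in
// [1, maxReplicas], and vary the memory spec so the scheduler sees a
// changed model definition and creates a new model *version* rather
// than merely scaling replicas of the existing one.
function randomReplicaAndMemory(maxReplicas, rand = Math.random()) {
  const replicas = Math.floor(rand * maxReplicas) + 1
  // Half the time the memory is tied to the replica count ("2k", "3k", ...),
  // otherwise it is fixed at "1k"; either flip produces a spec change.
  const memory = (rand > 0.5) ? replicas + "k" : "1k"
  return { replicas, memory }
}
```

For example, `randomReplicaAndMemory(4, 0.9)` yields 4 replicas with memory `"4k"`, while `randomReplicaAndMemory(4, 0.2)` yields 1 replica with memory `"1k"`.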
lc525 (Member):

it's a bit awkward to also update the memory in a util function called "applyModelReplicaChange". But let's add it like this, and I can modify things afterwards.

This is because I'm already modifying both replicas and memory in the control-plane tests, and the replica (and now memory) changes in utils.js have been introduced independently. I'll unify those.

sakoush (Member, Author):

The memory itself is not the target of the test here; I am just trying to induce a version change. If you are changing it later, then perhaps it's fine to get it merged for now.

@lc525 (Member) left a review:

LGTM, with the comment that I'll update this to be more consistent across scenarios.

@lc525 (Member) commented Nov 8, 2024:

More concretely, on the change itself: this will set the memory to "<replicas>k". I assume we'll always use this with small models, so there is no actual issue in terms of scheduling onto servers where the memory hint actually matters.

sakoush (Member, Author):

That's a good point; ideally I should induce a small change to the memory value provided by the user.

sakoush (Member, Author):

As discussed, will merge as it stands and improve in a follow-up PR.

@sakoush sakoush added the v2 label Nov 8, 2024
@sakoush sakoush merged commit 13ad524 into SeldonIO:v2 Nov 8, 2024
4 checks passed