Introduce prepare for eval, fix evaluation bug #789
Conversation
MLCommons CLA bot: All contributors have signed the MLCommons CLA ✍️ ✅
@fsschneider can we get your LGTM on this?
Sorry for the very late review.
It looks good to me! Thanks for all the work, Niccolò!
Quick question: the changes to the `reference_algorithms` and `prize_qualification` algorithms aren't strictly necessary, right? Since we now default to a no-op if `prepare_for_eval` is not defined.
I fully agree that making it backward-compatible in that way is really helpful.
Again, thanks for this crucial PR!
Thank you for reviewing! Correct, the updates to the `reference_algorithms` and `prize_qualification` algorithms aren't strictly necessary. I figured it would be nice to have all the implemented submissions reflect the latest state of the code, but I can reverse those changes if needed! (:
I agree with you. No need to revert, just wanted to check if I understand it correctly :)
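The backward-compatible no-op default discussed above could be implemented along the following lines. This is a hedged sketch, not the runner's actual code: the helper name `get_prepare_for_eval` and the exact state arguments are assumptions for illustration.

```python
import types


def get_prepare_for_eval(submission_module):
    """Return the submission's prepare_for_eval, or a no-op fallback.

    The fallback keeps submissions written before the prepare_for_eval
    API working unchanged: it simply returns the current state.
    """
    def _noop(optimizer_state, current_param_container, model_state, rng):
        # No-op default: leave optimizer state, params and model state untouched.
        return optimizer_state, current_param_container, model_state

    return getattr(submission_module, "prepare_for_eval", _noop)


# An "old" submission module that predates the API keeps working:
old_submission = types.SimpleNamespace()
prepare = get_prepare_for_eval(old_submission)
state = prepare("opt_state", "params", "model_state", rng=0)
```

A simple `getattr` with a default keeps the dispatch in one place, so neither old nor new submissions need conditional logic at the call site.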
Description
This pull request introduces a
`prepare_for_eval`
function and updates the code to support it. The implementation follows the blueprint of @fsschneider in #719 (comment) and fixes the bug of giving a free evaluation to a submission that exceeds `max_runtime` (again, #719 (comment)).
Function signature
The arguments of `prepare_for_eval` are the same as `update_params`, except for `batch`. I believe that `prepare_for_eval` should indeed be agnostic to the last batch used during training. The return type is the same as `update_params`.
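The relationship described above could be sketched as follows. These are hypothetical, simplified signatures: the real spec's functions carry more arguments (parameter types, loss type, eval results, ...), so treat the argument lists here as assumptions.

```python
from typing import Any, Tuple

# (optimizer_state, params, model_state) -- same return triple for both.
UpdateReturn = Tuple[Any, Any, Any]


def update_params(workload: Any,
                  current_param_container: Any,
                  model_state: Any,
                  hyperparameters: Any,
                  batch: Any,  # the current training batch
                  optimizer_state: Any,
                  global_step: int,
                  rng: Any) -> UpdateReturn:
    # ... one optimizer step on `batch` would happen here ...
    return optimizer_state, current_param_container, model_state


def prepare_for_eval(workload: Any,
                     current_param_container: Any,
                     model_state: Any,
                     hyperparameters: Any,
                     # no `batch` argument: prepare_for_eval is agnostic
                     # to the last batch used during training
                     optimizer_state: Any,
                     global_step: int,
                     rng: Any) -> UpdateReturn:
    # e.g. swap in EMA weights or otherwise finalize state for eval.
    return optimizer_state, current_param_container, model_state
```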
List of changes
In `submission_runner.py`:
- call `prepare_for_eval`, wrapped in the `profiler`
- `del batch` before `prepare_for_eval` (instead of before evaluation)
- update `accumulated_submission_time` after `prepare_for_eval`
- check `is_time_remaining` after `prepare_for_eval`
- run the evaluation only if `is_time_remaining`
- add a `prep_eval_rng`
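Taken together, the runner-side steps above could look roughly like this in the training loop. This is an illustrative sketch under assumed names (`training_iteration`, `MAX_RUNTIME`, the `state` argument), not the runner's actual code.

```python
import time

MAX_RUNTIME = 60.0  # seconds; illustrative budget, not the real value


def training_iteration(accumulated_submission_time, prepare_for_eval,
                       state, prep_eval_rng):
    """Simplified timing logic around evaluation, mirroring the list above."""
    batch = {"inputs": [1, 2, 3]}  # stand-in for the last training batch
    # ... update_params(..., batch, ...) would run here ...

    del batch  # free the batch *before* prepare_for_eval, not before eval

    start = time.monotonic()
    state = prepare_for_eval(state, prep_eval_rng)  # timed as submission work
    accumulated_submission_time += time.monotonic() - start

    # Check the budget again *after* prepare_for_eval, so a submission
    # that exceeds max_runtime no longer gets a free final evaluation.
    is_time_remaining = accumulated_submission_time < MAX_RUNTIME
    if is_time_remaining:
        pass  # run the (untimed) evaluation only while time remains
    return state, accumulated_submission_time, is_time_remaining
```

Counting `prepare_for_eval` toward `accumulated_submission_time` and re-checking the budget afterwards is what closes the free-evaluation loophole from #719.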
Minor changes:
- add `PrepareForEvalFn` to `spec`
- add `prepare_for_eval` to the submission template
- add `prepare_for_eval` to all PyTorch and JAX submissions

Fixes #719 and #758.
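A minimal sketch of what the `PrepareForEvalFn` type and the template default might look like; the real spec's type is more precise, so the signature below is an assumption for illustration.

```python
from typing import Any, Callable, Tuple

# Hypothetical, simplified callable type for the new API: it maps the
# current training state (minus the batch) to the state used for eval.
PrepareForEvalFn = Callable[..., Tuple[Any, Any, Any]]


def prepare_for_eval(workload, current_param_container, model_state,
                     hyperparameters, optimizer_state, global_step, rng):
    """Template default: return the training state unchanged."""
    return optimizer_state, current_param_container, model_state


# The template function conforms to the callable type:
fn: PrepareForEvalFn = prepare_for_eval
```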