
Fix activation checkpointing #57

Merged 1 commit on Jan 2, 2024

Conversation

@cstub (Collaborator) commented Dec 30, 2023

@cstub requested a review from @krasserm on December 30, 2023 10:57
@cstub force-pushed the wip-fix-activation-checkpoint branch from 5ced275 to ba4abb4 on December 31, 2023 13:13
- Create a new function `activation_checkpoint_wrapper` that converts module outputs into a form
  compatible with fairscale's activation checkpoint wrapper
- Use `static_graph` (see https://pytorch.org/docs/stable/generated/torch.nn.parallel.DistributedDataParallel.html)
  in img_clf training to allow training with activation checkpointing, which otherwise
  fails with an error
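A common reason such a conversion function is needed is that fairscale's `checkpoint_wrapper` handles tensor and tuple outputs, while modules that return dict-like outputs need flattening on the way in and restoring on the way out. The sketch below is a hypothetical illustration of that conversion idea, not the actual implementation in this PR; the function names and the assumption that dict outputs are the incompatibility are mine. On the DDP side, the fix corresponds to constructing the model with `DistributedDataParallel(model, static_graph=True)`.

```python
# Hypothetical sketch (assumption, not the PR's code): convert dict-style
# module outputs to tuples so they can pass through a checkpoint wrapper
# that only supports tensors/tuples, then restore the original structure.

def flatten_output(output):
    """If the module output is a dict, split it into (keys, values-tuple);
    otherwise pass it through unchanged with keys=None."""
    if isinstance(output, dict):
        keys = tuple(output.keys())
        return keys, tuple(output.values())
    return None, output

def restore_output(keys, values):
    """Rebuild the original dict output from the flattened values,
    or return the values unchanged if there was nothing to flatten."""
    if keys is None:
        return values
    return dict(zip(keys, values))

# Example: a dict output is flattened for checkpointing and restored after.
keys, values = flatten_output({"logits": 1, "loss": 2})
restored = restore_output(keys, values)
```

In a real wrapper, `flatten_output` would run on the inner module's return value inside `forward`, and `restore_output` on the checkpointed result before returning it to the caller.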
@cstub force-pushed the wip-fix-activation-checkpoint branch from ba4abb4 to 192b80b on December 31, 2023 13:15
@krasserm (Owner) left a comment
LGTM!

@krasserm merged commit b08d160 into main on Jan 2, 2024
10 checks passed
2 participants