
Commit

Add BatchMemoryManager reference in the FAQ (#449)
Summary:
Pull Request resolved: #449

Related to [this issue](#446).

Reviewed By: Anonymani

Differential Revision: D37646431

fbshipit-source-id: 3f05c7f91c11248a59e82c46f1d56eb2ac1ff604
Pierre Stock authored and facebook-github-bot committed Jul 7, 2022
1 parent fc71e2b commit 181c6b7
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion docs/faq.md
@@ -102,7 +102,7 @@ Your model most likely contains modules that are not compatible with Opacus. The

## What is virtual batch size?

- Opacus computes and stores *per-sample* gradients under the hood. What this means is that, for every regular gradient expected by the optimizer, Opacus will store `batch_size` per-sample gradients on each step. To balance peak memory requirement, which is proportional to `batch_size` ^ 2, and training performance, we use virtual batches. With virtual batches we can separate physical steps (gradient computation) and logical steps (noise addition and parameter updates): use larger batches for training, while keeping memory footprint low.
+ Opacus computes and stores *per-sample* gradients under the hood. What this means is that, for every regular gradient expected by the optimizer, Opacus will store `batch_size` per-sample gradients on each step. To balance peak memory requirement, which is proportional to `batch_size` ^ 2, and training performance, we use virtual batches. With virtual batches we can separate physical steps (gradient computation) and logical steps (noise addition and parameter updates): use larger batches for training, while keeping memory footprint low. See the [Batch Memory Manager](https://opacus.ai/api/batch_memory_manager.html) for seamless integration into your training code.
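
For context, a minimal sketch of how `BatchMemoryManager` might wrap a training loop. The `model`, `optimizer`, `criterion`, and `train_loader` names and the physical batch size of 128 are illustrative assumptions, not part of this commit; the optimizer is assumed to be the DP optimizer returned by `PrivacyEngine.make_private`:

```python
from opacus.utils.batch_memory_manager import BatchMemoryManager

# Wrap the (possibly very large) logical-batch data loader so that each
# physical step processes at most `max_physical_batch_size` samples, while
# noise addition and parameter updates still happen once per logical batch.
with BatchMemoryManager(
    data_loader=train_loader,
    max_physical_batch_size=128,  # illustrative value; tune to available memory
    optimizer=optimizer,
) as memory_safe_data_loader:
    for data, target in memory_safe_data_loader:
        optimizer.zero_grad()
        loss = criterion(model(data), target)
        loss.backward()
        # The optimizer only takes a real (logical) step once the full
        # logical batch has been accumulated across physical batches.
        optimizer.step()
```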

## What are `alphas`?

