Fix DistributedDP Optimizer for Fast Gradient Clipping (#662)
Summary:
Pull Request resolved: #662

The step function incorrectly called "original_optimizer.original_optimizer" instead of "original_optimizer". This is now fixed.
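For context, a minimal sketch of the wrapper layout involved (illustrative names, not the actual Opacus class bodies): the DP optimizer wrappers store the plain torch.optim optimizer as original_optimizer, so the distributed subclass only needs a single dereference; the extra ".original_optimizer" in the old code would typically fail with an AttributeError on a plain torch optimizer.

import torch

class DPOptimizerSketch:
    # Stand-in for DPOptimizerFastGradientClipping: wraps a plain torch optimizer.
    def __init__(self, optimizer: torch.optim.Optimizer):
        self.original_optimizer = optimizer  # the underlying torch.optim optimizer

class DistributedDPOptimizerSketch(DPOptimizerSketch):
    # Stand-in for DistributedDPOptimizerFastGradientClipping.
    def reduce_gradients(self):
        pass  # gradient all-reduce elided in this sketch

    def pre_step(self) -> bool:
        return True  # clipping/noise bookkeeping elided in this sketch

    def step(self):
        if self.pre_step():
            self.reduce_gradients()
            # Correct: original_optimizer is already the plain torch optimizer.
            return self.original_optimizer.step()
            # The buggy call was self.original_optimizer.original_optimizer.step(),
            # which has no meaning on a plain torch optimizer.
        else:
            return None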

Reviewed By: HuanyuZhang

Differential Revision: D60484128

fbshipit-source-id: 1bde00292b2afccc31803ebefb2c361dc7e9bb77
EnayatUllah authored and facebook-github-bot committed Aug 2, 2024
1 parent 4804a51 commit eb94674
Showing 2 changed files with 3 additions and 2 deletions.
3 changes: 2 additions & 1 deletion opacus/__init__.py
@@ -14,14 +14,15 @@
 # limitations under the License.
 
 from . import utils
-from .grad_sample import GradSampleModule
+from .grad_sample import GradSampleModule, GradSampleModuleFastGradientClipping
 from .privacy_engine import PrivacyEngine
 from .version import __version__
 
 
 __all__ = [
     "PrivacyEngine",
     "GradSampleModule",
+    "GradSampleModuleFastGradientClipping",
     "utils",
     "__version__",
 ]
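With GradSampleModuleFastGradientClipping now re-exported from the package root, the class can be imported directly from opacus, for example:

from opacus import GradSampleModuleFastGradientClipping, PrivacyEngine  # both resolve after this change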
2 changes: 1 addition & 1 deletion opacus/optimizers/ddpoptimizer_fast_gradient_clipping.py
@@ -76,6 +76,6 @@ def step(
 
         if self.pre_step():
             self.reduce_gradients()
-            return self.original_optimizer.original_optimizer.step()
+            return self.original_optimizer.step()
         else:
             return None
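For reference, a rough end-to-end sketch of a setup that reaches this code path (distributed DP-SGD with fast gradient clipping). It assumes the grad_sample_mode="ghost" entry point of PrivacyEngine.make_private and the DifferentiallyPrivateDistributedDataParallel wrapper; the hyperparameters and the pre-existing model / train_loader are placeholders, not part of this commit.

import torch
import torch.nn as nn
from opacus import PrivacyEngine
from opacus.distributed import DifferentiallyPrivateDistributedDataParallel as DPDDP

# Assumes torch.distributed is already initialized and `model` / `train_loader` exist.
model = DPDDP(model)  # distributed wrapper; make_private then selects the distributed DP optimizer
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
criterion = nn.CrossEntropyLoss()

privacy_engine = PrivacyEngine()
model, optimizer, criterion, train_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    criterion=criterion,
    data_loader=train_loader,
    noise_multiplier=1.0,
    max_grad_norm=1.0,
    grad_sample_mode="ghost",  # fast gradient clipping
)

for data, target in train_loader:
    optimizer.zero_grad()
    loss = criterion(model(data), target)
    loss.backward()
    optimizer.step()  # the call path fixed by this commit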
