
Fix: Passing through defaults in DPOptimizer #329

Conversation

@alexandresablayrolles (Contributor) commented on Jan 18, 2022

Problem

Described in #322. Code that uses learning-rate schedulers can break because DPOptimizer has no `.defaults` field, for example when using NAdam, or schedulers with `cycle_momentum=True` (sketched below).
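For illustration, here is a minimal repro sketch (not code from this PR): a hypothetical `Wrapper` class stands in for the pre-fix DPOptimizer, subclassing `torch.optim.Optimizer` without ever setting `defaults`. PyTorch's `CyclicLR` reads `optimizer.defaults` when `cycle_momentum=True`, so constructing the scheduler fails:

```python
import torch
from torch.optim import Optimizer
from torch.optim.lr_scheduler import CyclicLR

# Hypothetical stand-in for the pre-fix DPOptimizer: it subclasses Optimizer
# but never calls super().__init__(), so it has no `defaults` attribute.
class Wrapper(Optimizer):
    def __init__(self, optimizer):
        self.original_optimizer = optimizer
        self.param_groups = optimizer.param_groups
        self.state = optimizer.state

    def step(self, closure=None):
        return self.original_optimizer.step(closure)

model = torch.nn.Linear(2, 1)
wrapped = Wrapper(torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9))

# CyclicLR with cycle_momentum=True inspects `optimizer.defaults` to verify
# momentum support, so this raises:
#   AttributeError: 'Wrapper' object has no attribute 'defaults'
CyclicLR(wrapped, base_lr=0.01, max_lr=0.1, cycle_momentum=True)
```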

Solution

As suggested by @gkaissis, simply pass the `defaults` field through to DPOptimizer.
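A minimal sketch of the idea behind the change (assuming, as in Opacus, that the wrapper keeps the inner optimizer around; attribute names and the reduced constructor here are illustrative, not the verbatim diff):

```python
from torch.optim import Optimizer

class DPOptimizer(Optimizer):
    # Sketch only: the real Opacus DPOptimizer takes additional DP arguments
    # (noise multiplier, max grad norm, ...) and overrides step/zero_grad.
    def __init__(self, optimizer: Optimizer):
        self.original_optimizer = optimizer
        self.param_groups = optimizer.param_groups
        self.state = optimizer.state
        # The fix: pass the wrapped optimizer's hyperparameter defaults
        # through, so schedulers that inspect optimizer.defaults
        # (e.g. CyclicLR with cycle_momentum=True) keep working.
        self.defaults = optimizer.defaults
```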

Summary: Adding `defaults` to DPOptimizer to match the PyTorch Optimizer.

Differential Revision: D33634214

fbshipit-source-id: da2f8797da05860be68b90f99e181d6d94200755
@facebook-github-bot added the CLA Signed and fb-exported labels on Jan 18, 2022
@facebook-github-bot (Contributor) commented:

This pull request was exported from Phabricator. Differential Revision: D33634214

@romovpa (Contributor) commented on Jan 19, 2022

@alexandresablayrolles Could you please add more comments: why do we need to match `defaults`? What is it used for? Could there be other fields we need to match?

UPD: Found the original issue and updated the description.

@romovpa changed the title from "Adding defaults to DPOptimizer to match Pytorch Optimizer" to "Fix: Passing through defaults in DPOptimizer" on Jan 19, 2022
Labels

CLA Signed, fb-exported
Development

Successfully merging this pull request may close these issues.

DPOptimizer should pass through defaults
3 participants