
Add Clipping schedulers #556

Closed · wants to merge 1 commit

Conversation

@Darktex (Contributor) commented Jan 20, 2023

Summary:
This diff introduces gradient clipping schedulers that can be used to vary the gradient clipping threshold over the course of training.

Addresses #375 in OSS.

Differential Revision: D42644261
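The idea mirrors PyTorch's learning-rate schedulers: instead of decaying the learning rate, the scheduler updates the clipping threshold once per epoch. Below is a minimal sketch of a step-decay variant, assuming an optimizer that exposes a `max_grad_norm` attribute (as Opacus's DPOptimizer does); the class and parameter names here are illustrative, not necessarily the API this PR adds.

```python
class StepGradClip:
    """Multiply the clipping threshold by `gamma` every `step_size` epochs.

    Hypothetical sketch: assumes `optimizer.max_grad_norm` holds the
    per-sample gradient clipping threshold, as in Opacus's DPOptimizer.
    """

    def __init__(self, optimizer, *, step_size: int, gamma: float):
        self.optimizer = optimizer
        self.step_size = step_size
        self.gamma = gamma
        self.epoch = 0

    def step(self) -> None:
        # Call once per epoch, alongside any LR scheduler.
        self.epoch += 1
        if self.epoch % self.step_size == 0:
            self.optimizer.max_grad_norm *= self.gamma
```

In a training loop this would be driven exactly like `torch.optim.lr_scheduler.StepLR`: one `scheduler.step()` at the end of each epoch.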

@facebook-github-bot added the CLA Signed and fb-exported labels Jan 20, 2023

@facebook-github-bot (Contributor) commented:

This pull request was exported from Phabricator. Differential Revision: D42644261

Darktex added a commit to Darktex/opacus that referenced this pull request Jan 20, 2023
Summary: (same as the PR description above)
Pull Request resolved: pytorch#556
Differential Revision: D42644261
fbshipit-source-id: 7480e2f81432bd4a05d58af420ee4a783bdd63f1

@karthikprasad (Contributor) left a comment:

Thanks for taking this one on. The changes look good to me, but be sure to fix the lint before merging. :)

Review thread on `test_checkpoints` (diff excerpt):

    )
    @settings(deadline=None)
    def test_checkpoints(self, noise_scheduler: Optional[Type[StepNoise]]):
@karthikprasad (Contributor) commented:

Could you break this function into multiple smaller functions to make flake8 happy?

@Darktex (Contributor, Author) replied:

Yeah, looks like that function was right at the limit for complexity and I crossed it :D Will refactor.
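For context, the lint check in question is flake8's McCabe complexity limit: it counts the number of branches in a function and fails when that count exceeds a configured ceiling. A typical configuration (the exact threshold in Opacus's own setup is an assumption here) looks like:

```
[flake8]
max-complexity = 10
```

Splitting a large test into smaller helper functions reduces each function's branch count below the limit without changing what is tested.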

Darktex added a commit to Darktex/opacus that referenced this pull request Jan 24, 2023
Summary: (same as the PR description above)
Reviewed By: karthikprasad
Differential Revision: D42644261
fbshipit-source-id: 91bedb4c3dd68f336917d16cec42f939ace02406
@facebook-github-bot commented:

This pull request has been merged in d888fd0.
