Throw an error when the optimizer's params are not the same as the module's in `make_private`
#439
Conversation
This pull request was exported from Phabricator. Differential Revision: D37163873
Thanks Deepak! Looks good. Could you also add a simple unit test in privacy_engine_test for this check?
…module's in `make_private` (pytorch#439)

Summary: Pull Request resolved: pytorch#439. Compare `nn.Module.parameters()` with the list of parameters from all `param_groups` of the optimizer. If they are not all equal, raise the error "Module parameters are different than optimizer Parameters".

Differential Revision: D37163873
fbshipit-source-id: 73a8c3b5afa1b1e080c6fe7ed77a4a6130c9fdf9
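The check described above can be sketched as follows. This is a hypothetical illustration of the comparison the PR performs, not the actual Opacus source; the helper name `params_match` is made up, and parameters are compared by identity, since the same tensor object must appear in both the module and the optimizer.

```python
import torch
import torch.nn as nn

def params_match(module: nn.Module, optimizer: torch.optim.Optimizer) -> bool:
    """Return True iff the optimizer's param_groups cover exactly the
    module's parameters, compared by object identity."""
    module_params = {id(p) for p in module.parameters()}
    optimizer_params = {
        id(p) for group in optimizer.param_groups for p in group["params"]
    }
    return module_params == optimizer_params

model = nn.Linear(4, 2)
good_opt = torch.optim.SGD(model.parameters(), lr=0.1)

# An optimizer built over a *different* module's parameters should fail the check.
other = nn.Linear(4, 2)
bad_opt = torch.optim.SGD(other.parameters(), lr=0.1)

print(params_match(model, good_opt))  # True
print(params_match(model, bad_opt))   # False

if not params_match(model, bad_opt):
    # Mirrors the error message quoted in the commit summary.
    msg = "Module parameters are different than optimizer Parameters"
```

A `make_private`-style entry point would raise `ValueError(msg)` when the sets differ, so mismatched module/optimizer pairs fail fast instead of silently training without privacy accounting for some parameters.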
Thanks for adding the tests @deepakagrawal! Please also fix the lint failure and it is good to go after that :)
@karthikprasad can you please check the errors? I think they are not related to my changes. I fixed the lint failure.
Looks good. Thank you @deepakagrawal!
…module's in `make_private` (pytorch#439). Reviewed By: karthikprasad. Differential Revision: D37163873. fbshipit-source-id: daca6711570dd7006c1aaf7f757c3d81dbaaeda1
@karthikprasad I pushed the latest commit to GitHub and will land the diff as well. Can you please merge this on GitHub?
Summary: Compare `nn.Module.parameters()` with the list of parameters from all `param_groups` of the optimizer. If they are not all equal, raise the error "Module parameters are different than optimizer Parameters".
Differential Revision: D37163873