
fp16 support #377

Open
ffuuugor opened this issue Mar 11, 2022 · 2 comments
Labels: enhancement (New feature or request)

Comments

@ffuuugor
Contributor

Request: add fp16 support to Opacus (possibly with autocast and loss scaling, but in our experience with Alex this may result in training instabilities). On the other hand, fp16 reduces memory usage and increases speed (roughly a factor of 2 for both).
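For reference, the autocast + loss-scaling approach mentioned above would look roughly like the standard torch.cuda.amp recipe sketched below. This is a minimal sketch only: model, optimizer, and data_loader are placeholders, and it is plain PyTorch mixed-precision training without any of Opacus' per-sample gradient machinery.

    import torch
    import torch.nn.functional as F

    scaler = torch.cuda.amp.GradScaler()

    for inputs, targets in data_loader:
        optimizer.zero_grad()

        # Forward pass runs in mixed precision (fp16 where safe, fp32 elsewhere)
        with torch.cuda.amp.autocast():
            outputs = model(inputs)
            loss = F.cross_entropy(outputs, targets)

        # Scale the loss so small fp16 gradients don't underflow,
        # then unscale and step the optimizer
        scaler.scale(loss).backward()
        scaler.step(optimizer)
        scaler.update()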

ffuuugor added the enhancement label on Mar 11, 2022
@yuxiang-guo

Hi, I wonder whether Opacus supports fp16. In my experiment I used fp16 but got this error:

File "/home/yxguo/anaconda3/envs/ditto/lib/python3.7/site-packages/apex/amp/handle.py", line 110, in scale_loss
if not optimizer._amp_stash.params_have_scaled_gradients:
AttributeError: 'DPOptimizer' object has no attribute '_amp_stash'
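For context, the traceback comes from apex's scale_loss context manager, which expects apex-internal state (_amp_stash) that amp.initialize attaches to the optimizer; the DPOptimizer wrapper produced by Opacus does not carry that attribute. Below is a rough guess at the kind of setup that triggers this; model, optimizer, data_loader, and loss are placeholders, and the exact ordering in the original experiment may differ.

    from apex import amp
    from opacus import PrivacyEngine

    # apex.amp.initialize attaches its internal _amp_stash state
    # to the optimizer object it is given
    model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

    # make_private wraps the optimizer in a DPOptimizer, which does not
    # expose apex's _amp_stash attribute
    privacy_engine = PrivacyEngine()
    model, optimizer, data_loader = privacy_engine.make_private(
        module=model,
        optimizer=optimizer,
        data_loader=data_loader,
        noise_multiplier=1.0,
        max_grad_norm=1.0,
    )

    # apex's loss scaling then fails with the AttributeError shown above
    with amp.scale_loss(loss, optimizer) as scaled_loss:
        scaled_loss.backward()

Note that the _amp_stash check is specific to apex; PyTorch's native torch.cuda.amp does not use it, though Opacus does not officially support fp16 with either path.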

@ffuuugor
Contributor Author

Hi,
As of this moment, Opacus doesn't support fp16.

This issue was created to track progress on adding that support. As of today it's not something planned for the near future (but we always welcome external pull requests).
