Tags: meta-pytorch/opacus
Opacus release v1.5.2 (#663) Summary: Pull Request resolved: #663 Release a new version of Opacus. Additionally, we replace `opt_einsum.contract` with `torch.einsum` to avoid errors when `opt_einsum` is not available. This does not hurt performance, since torch automatically dispatches to `opt_einsum` for acceleration when the package is available (https://pytorch.org/docs/stable/generated/torch.einsum.html). Code pointer: https://pytorch.org/docs/stable/_modules/torch/backends/opt_einsum.html#is_available Reviewed By: EnayatUllah Differential Revision: D60672828 fbshipit-source-id: f8bbc0aa404e48f15ce129689a6e55af68daa5e4
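The change amounts to swapping the contraction call. A minimal sketch of the idea (the einsum equation and tensor shapes here are illustrative placeholders, not Opacus's actual contractions):

```python
import torch

# Illustrative per-sample outer-product contraction: A could stand in for
# per-sample activations and B for per-sample backprops (shapes made up).
A = torch.randn(8, 4, 3)
B = torch.randn(8, 4, 5)

# torch.einsum accepts the same equation string as opt_einsum.contract,
# so the substitution is a drop-in change.
per_sample = torch.einsum("nki,nkj->nij", A, B)
print(per_sample.shape)  # torch.Size([8, 3, 5])

# torch routes einsum through opt_einsum for contraction-path optimization
# when the package is installed; this can be checked at runtime:
print(torch.backends.opt_einsum.is_available())
```

With this check happening inside torch itself, Opacus no longer needs a hard dependency on `opt_einsum` to get the optimized paths when they are available.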
Bump PyTorch version for PyPI (#568) Summary: The deploy failed due to an unavailable Python version: https://github.com/pytorch/opacus/actions/runs/4241660361 Pull Request resolved: #568 Reviewed By: karthikprasad Differential Revision: D43494621 Pulled By: ffuuugor fbshipit-source-id: d8ae67e0133ee963715b34966ae7193156def309
Add third blog series to the website (#537) Summary:

## Types of changes

- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to change)
- [x] Docs change / refactoring / dependency upgrade

## Motivation and Context / Related issue

Add third blog series to the website.

## How Has This Been Tested (if it applies)

No test is required.

## Checklist

- [x] The documentation is up-to-date with the changes I made.
- [x] I have read the **CONTRIBUTING** document and completed the CLA (see **CONTRIBUTING**).
- [x] All tests passed, and additional code has been covered with new tests.

Pull Request resolved: #537 Reviewed By: karthikprasad Differential Revision: D41042020 Pulled By: ashkan-software fbshipit-source-id: d10d65bb3d56620a01a57cc45da12b87041b1354
No-Op GradSampleModule (#492) Summary: TL;DR: Adds a no-op GradSampleModule for the case where grad samples are computed by functorch. The CIFAR10 example has been updated to show a typical use case. The neat thing about functorch is that it produces per-sample gradients directly with a couple of lines of code; the end user then manually assigns these per-sample gradients to `p.grad_sample`. Pull Request resolved: #492 Reviewed By: ffuuugor Differential Revision: D39204008 Pulled By: alexandresablayrolles fbshipit-source-id: 22036e6c941522bba7749ef46f97d54f6ee8c551
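For reference, a minimal sketch of that workflow using the `torch.func` API (functorch's successor in recent PyTorch); the model, loss function, and data below are illustrative placeholders, not the CIFAR10 example:

```python
import torch
from torch.func import functional_call, grad, vmap

model = torch.nn.Linear(4, 2)       # placeholder model
loss_fn = torch.nn.CrossEntropyLoss()

# Functional parameter dict, detached from the module's own autograd state.
params = {k: v.detach() for k, v in model.named_parameters()}

def compute_loss(params, sample, target):
    # Add a batch dimension of 1 so the module sees batched input.
    preds = functional_call(model, params, (sample.unsqueeze(0),))
    return loss_fn(preds, target.unsqueeze(0))

# grad differentiates w.r.t. params; vmap maps over the batch dimension,
# yielding one gradient per sample in a couple of lines.
per_sample_grads = vmap(grad(compute_loss), in_dims=(None, 0, 0))

x = torch.randn(8, 4)
y = torch.randint(0, 2, (8,))
grads = per_sample_grads(params, x, y)  # dict: name -> (batch, *param.shape)

# Hand the per-sample gradients to Opacus by populating p.grad_sample,
# as the commit describes for the no-op GradSampleModule path.
for name, p in model.named_parameters():
    p.grad_sample = grads[name]
```

The no-op module then skips Opacus's own hook-based per-sample gradient computation and simply consumes whatever the user has stored in `p.grad_sample`.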