Releases: lxuechen/private-transformers
v0.2.3
- The codebase now supports the following additional models:
  - BartForConditionalGeneration (when positional embedding layers are frozen)
  - T5ForConditionalGeneration
  - OPTForCausalLM
  - ViTForImageClassification (when isolated parameters are frozen; see this example)
  - DeiTForImageClassification (when isolated parameters are frozen)
  - BeitForImageClassification (when isolated parameters are frozen)
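Several of the models above are supported only when certain parameters (e.g. positional embeddings) are frozen. A minimal sketch of what "freezing" means in PyTorch terms, using a hypothetical toy module (`TinyModel`, `pos_embed` are illustrative names, not part of the library):

```python
import torch
from torch import nn

class TinyModel(nn.Module):
    """Hypothetical stand-in for a model with positional embeddings."""
    def __init__(self):
        super().__init__()
        self.pos_embed = nn.Embedding(16, 8)  # plays the role of positional embeddings
        self.linear = nn.Linear(8, 2)

model = TinyModel()

# Freeze any parameter whose name marks it as a positional embedding,
# mirroring the "frozen positional embedding layers" requirement above.
for name, param in model.named_parameters():
    if "pos_embed" in name:
        param.requires_grad_(False)

trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)  # only the linear layer's weight and bias remain trainable
```

The actual parameter names to freeze depend on the Hugging Face model class; check the model's `named_parameters()` output for the embedding layers in question.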
- Minor rewrite of the privacy engine: ghost clipping and default clipping now share code, which should make future maintenance easier.
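Both ghost clipping and default clipping ultimately apply the same per-example rule, which is the natural piece of shared logic. A conceptual sketch of that rule (the formula below is the standard per-example clipping coefficient, not necessarily the library's exact code path):

```python
def clip_coefficients(per_example_norms, max_grad_norm):
    """Per-example clipping factors min(1, C / ||g_i||).

    Ghost clipping computes the norms without materializing per-example
    gradients; default clipping materializes them first. Conceptually,
    both then scale each example's gradient by this coefficient.
    """
    return [min(1.0, max_grad_norm / (norm + 1e-6)) for norm in per_example_norms]

coeffs = clip_coefficients([0.5, 2.0, 10.0], max_grad_norm=1.0)
# small-norm examples are left untouched (coefficient 1.0);
# large-norm examples are scaled down toward C / ||g_i||
```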
- Add an example file for fine-tuning vision Transformers on CIFAR-10.
- Use a callback-style implementation for inspecting and modifying parameters and gradients after clipping.
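The callback-style pattern can be sketched as follows. This is a toy illustration of the idea (a user-supplied function invoked with the post-clipping result), not the engine's actual signature:

```python
def run_clipping_step(grads, max_norm, callback=None):
    """Clip a toy gradient vector, then hand the clipped result to an
    optional callback -- the callback-style pattern described above."""
    norm = sum(g * g for g in grads) ** 0.5
    scale = min(1.0, max_norm / (norm + 1e-6))
    clipped = [g * scale for g in grads]
    if callback is not None:
        callback(clipped)  # user code sees params/grads after clipping
    return clipped

seen = []
run_clipping_step([3.0, 4.0], max_norm=1.0, callback=seen.append)
# `seen[0]` now holds the post-clipping gradients
```

The benefit of the callback style is that user code can observe or modify gradients at exactly the right point in the step, without subclassing or patching the engine.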
- Allow forward and backward hooks to accept additional args and kwargs (not necessarily torch.Tensor instances).
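A minimal sketch of hooks that forward arbitrary extra args and kwargs to registered callbacks. The `Hookable` class below is purely illustrative; it only demonstrates the calling convention (non-tensor extras flowing through to hooks), not the library's hook machinery:

```python
class Hookable:
    """Toy module whose hooks receive extra args/kwargs, not just tensors."""

    def __init__(self):
        self._hooks = []

    def register_hook(self, fn):
        self._hooks.append(fn)

    def forward(self, x, *args, **kwargs):
        out = x * 2  # placeholder computation
        for hook in self._hooks:
            hook(self, out, *args, **kwargs)  # extras flow through unchanged
        return out

calls = []
m = Hookable()
m.register_hook(lambda mod, out, *a, **kw: calls.append((out, a, kw)))
m.forward(3, "attention_mask", training=True)
# calls[0] == (6, ("attention_mask",), {"training": True})
```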
- Rewrite privacy accounting and remove the GDP + CLT accountant, since it under-accounts the privacy loss.