
# DistPro (ECCV 2022)

This is the official release of "DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization" (ECCV 2022).

Our method achieves faster distillation training on ImageNet-1K.
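
DistPro searches over candidate teacher-to-student distillation pathways and learns how strongly to weight each one via meta optimization. As a conceptual aid only, the PyTorch sketch below illustrates the weighted multi-pathway distillation loss under our own assumptions; it is not the code in this repository, and every name in it (`pathway_distill_loss`, `alphas`, the toy shapes) is hypothetical.

```python
# Minimal, hypothetical sketch (NOT this repository's implementation) of the core
# DistPro idea: each candidate teacher->student distillation pathway carries a
# learnable importance weight, and the student is trained against the weighted
# sum of per-pathway losses while the weights themselves are meta-optimized.

import torch
import torch.nn.functional as F


def pathway_distill_loss(student_feats, teacher_feats, alphas):
    """Weighted sum of per-pathway feature-matching losses.

    student_feats / teacher_feats: lists of feature maps with matching shapes.
    alphas: 1-D learnable tensor, one weight per candidate pathway.
    """
    weights = F.softmax(alphas, dim=0)            # normalize pathway weights
    losses = torch.stack([
        F.mse_loss(s, t.detach())                 # simple L2 feature matching
        for s, t in zip(student_feats, teacher_feats)
    ])
    return (weights * losses).sum()


# Toy usage with three random pathways of identical shape.
student_feats = [torch.randn(8, 16, 4, 4, requires_grad=True) for _ in range(3)]
teacher_feats = [torch.randn(8, 16, 4, 4) for _ in range(3)]
alphas = torch.zeros(3, requires_grad=True)       # meta-learned in the real method

loss = pathway_distill_loss(student_feats, teacher_feats, alphas)
loss.backward()                                   # gradients reach both features and alphas
```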

## Installation

```bash
pip install -r requirements.txt
```

## Usage

1. Search the distillation process on the CIFAR datasets with a reduced number of epochs:

```bash
python train_with_distillation.py --paper_setting SETTING_CHOICE --epochs 40
```

2. Retrain with the searched process for the full number of epochs:

```bash
python train_with_distillation.py --paper_setting SETTING_CHOICE --epochs 240 \
    --alpha_normalization_style 333
```

## Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.

## License

Apache-2.0

## Citation

```bibtex
@inproceedings{deng2022distpro,
  title={DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization},
  author={Xueqing Deng and Dawei Sun and Shawn Newsam and Peng Wang},
  booktitle={ECCV},
  year={2022}
}
```