
DistPro ECCV 2022

This is the official release of "DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization" (ECCV 2022).

Our method achieves faster distillation training on ImageNet1K.

Installation

pip install -r requirements.txt

Usage

1. Search for a distillation process on the CIFAR datasets with a reduced number of epochs.

python train_with_distillation.py --paper_setting SETTING_CHOICE --epochs 40

2. Retrain the student with the full number of epochs.

python train_with_distillation.py --paper_setting SETTING_CHOICE --epochs 240 \
--alpha_normalization_style 333
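The core idea behind the search step above is that each teacher-to-student distillation pathway gets an importance weight (alpha) that is optimized alongside training. The following is a minimal, simplified sketch of that weighting concept in plain Python; it is illustrative only and does not reproduce the repository's actual implementation, which meta-optimizes the alphas via gradient-based search. The function names and the plain-Python formulation are our own assumptions.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T produces a softer distribution.
    m = max(z / T for z in logits)
    exps = [math.exp(z / T - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # Standard knowledge-distillation loss: KL(teacher || student) on
    # temperature-softened distributions, scaled by T^2 (Hinton et al.).
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T * T) * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def weighted_pathway_loss(pathway_losses, alphas):
    # Simplified DistPro-style combination: per-pathway distillation losses
    # weighted by learnable alphas, normalized to sum to 1. In the paper the
    # alphas are found by meta optimization; here they are just given inputs.
    total = sum(alphas)
    return sum((a / total) * l for a, l in zip(alphas, pathway_losses))
```

For example, `weighted_pathway_loss([1.0, 2.0, 3.0], [1.0, 1.0, 1.0])` averages the three pathway losses to `2.0`, while skewed alphas emphasize one pathway over the others.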

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.

License

Apache-2.0

Citation

@inproceedings{deng2022distpro,
  title={DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization},
  author={Xueqing Deng and Dawei Sun and Shawn Newsam and Peng Wang},
  booktitle={ECCV},
  year={2022}
}
