Yan Fan, Yu Wang*, Pengfei Zhu, Qinghua Hu
Official PyTorch implementation of our AAAI 2024 paper "Dynamic Sub-graph Distillation for Robust Semi-supervised Continual Learning".
We have implemented the pre-processing of CIFAR10, CIFAR100, and imagenet100. When training on CIFAR10 or CIFAR100, this framework will download the dataset automatically. When training on imagenet100, you should specify the folder of your dataset in `utils/data.py`:
```python
def download_data(self):
    assert 0, "You should specify the folder of your dataset"
    train_dir = '[DATA-PATH]/train/'
    test_dir = '[DATA-PATH]/val/'
```
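For example, assuming your imagenet100 copy follows the usual `train/`/`val/` layout under a hypothetical path `/data/imagenet100`, the edit could look like:

```python
def download_data(self):
    # Remove the assert and point the paths at your local copy
    # (the path below is only a hypothetical example).
    train_dir = '/data/imagenet100/train/'
    test_dir = '/data/imagenet100/val/'
```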
- Generate the label_index files based on [https://github.com/brain-research/realistic-ssl-evaluation]. (We also provide our label_index files in `./data/[DATA NAME]_labelindex`; a minimal do-it-yourself sketch is given after this list.)
- Edit the `[MODEL NAME].json` file for global settings.
- Edit the hyperparameters in the corresponding `[MODEL NAME].py` file (e.g., `models/icarl.py`).
- Run:
```bash
python main.py --config=./exps/[MODEL NAME].json --label_num [NUM OF LABELED DATA]
```
where `[MODEL NAME]` should be chosen from `icarl`, `der`, `icarl_10`, `der_10`, etc.
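As an example, a hypothetical invocation for the iCaRL-based variant would be (the `--label_num` value here is only illustrative; choose it according to your experimental setting):

```bash
python main.py --config=./exps/icarl.json --label_num 500
```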
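If you want to generate a label_index file yourself instead of reusing the ones under `./data/`, the sketch below shows one way it could be done. It assumes the index is simply a flat array of labeled-sample positions saved with NumPy and samples a class-balanced subset; the function name, sampling scheme, and output filename are illustrative and should be adapted to the format of the provided `*_labelindex` files (which follow realistic-ssl-evaluation).

```python
import numpy as np

def build_label_index(targets, num_labeled, num_classes, seed=0,
                      out_path="cifar10_labelindex.npy"):
    """Illustrative sketch: pick a class-balanced set of labeled indices.

    targets      -- 1-D array of ground-truth labels for the training set
    num_labeled  -- total number of labeled samples to keep
    num_classes  -- number of classes in the dataset
    """
    rng = np.random.default_rng(seed)
    per_class = num_labeled // num_classes
    targets = np.asarray(targets)
    labeled_idx = []
    for c in range(num_classes):
        cls_idx = np.where(targets == c)[0]
        labeled_idx.extend(rng.choice(cls_idx, size=per_class, replace=False))
    labeled_idx = np.sort(np.array(labeled_idx))
    np.save(out_path, labeled_idx)  # hypothetical output format/filename
    return labeled_idx
```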
We thank the following repos for providing helpful components/functions used in our work.
If you find our code or paper useful, please consider giving us a star or citing our work:
```bibtex
@inproceedings{fan2024dynamic,
  title={Dynamic Sub-graph Distillation for Robust Semi-supervised Continual Learning},
  author={Fan, Yan and Wang, Yu and Zhu, Pengfei and Hu, Qinghua},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={38},
  number={11},
  pages={11927--11935},
  year={2024}
}
```