Dynamic Sub-graph Distillation for Robust Semi-supervised Continual Learning

Yan Fan, Yu Wang*, Pengfei Zhu, Qinghua Hu

Official PyTorch implementation of our AAAI 2024 paper "Dynamic Sub-graph Distillation for Robust Semi-supervised Continual Learning".

How to run DSGD?

Dependencies

  1. torch 1.8.1
  2. torchvision 0.6.0
  3. tqdm
  4. numpy
  5. scipy
  6. quadprog
  7. POT
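The dependencies above can be installed with pip; this is a setup sketch using the versions listed (the torch/torchvision pair you need may differ depending on your CUDA setup, so adjust the pins accordingly). POT is the Python Optimal Transport package, published on PyPI under the name `POT`.

```shell
# Pinned torch/torchvision as listed above; pick wheels matching your CUDA version.
pip install torch==1.8.1 torchvision==0.6.0
# Remaining dependencies (unpinned in the list above).
pip install tqdm numpy scipy quadprog POT
```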

Datasets

We have implemented pre-processing for CIFAR10, CIFAR100, and ImageNet100. When training on CIFAR10 or CIFAR100, the framework downloads the dataset automatically. When training on ImageNet100, you should specify the folder of your dataset in utils/data.py:

    def download_data(self):
        assert 0, "You should specify the folder of your dataset"  # remove this line after setting the paths below
        train_dir = '[DATA-PATH]/train/'
        test_dir = '[DATA-PATH]/val/'
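Before launching a long training run, it can help to fail fast if the paths were not set correctly. The helper below is not part of the repository; it is a minimal sketch you could call from `download_data` after filling in `train_dir` and `test_dir`.

```python
import os

def check_imagenet_dirs(train_dir, test_dir):
    """Raise a clear error if either ImageNet-100 split folder is missing.

    Hypothetical helper, not part of the DSGD codebase.
    """
    for split_name, path in (("train", train_dir), ("val", test_dir)):
        if not os.path.isdir(path):
            raise FileNotFoundError(
                f"ImageNet-100 {split_name} split not found at: {path}"
            )
    return True
```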

Run experiment

  1. Generate the label_index files following [https://github.com/brain-research/realistic-ssl-evaluation]. (We have also provided our label_index files in ./data/[DATA NAME]_labelindex.)
  2. Edit the [MODEL NAME].json file for global settings.
  3. Edit the hyperparameters in the corresponding [MODEL NAME].py file (e.g., models/icarl.py).
  4. Run:
python main.py --config=./exps/[MODEL NAME].json --label_num [NUM OF LABELED DATA]

where [MODEL NAME] should be chosen from icarl, der, icarl_10, der_10 etc.
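Step 1 above selects which training examples keep their labels. As a rough illustration of what such a selection looks like, here is a sketch that samples a class-balanced set of labeled indices; the function name, signature, and output format are assumptions for illustration only, and the actual files should be produced following realistic-ssl-evaluation.

```python
import numpy as np

def make_label_index(labels, label_num, seed=0):
    """Sample a class-balanced set of labeled indices.

    Hypothetical helper for illustration; `labels` is the full list of
    training labels and `label_num` the total number of labeled examples.
    """
    labels = np.asarray(labels)
    classes = np.unique(labels)
    per_class = label_num // len(classes)  # equal share per class
    rng = np.random.RandomState(seed)      # fixed seed for reproducibility
    picked = []
    for c in classes:
        idx = np.where(labels == c)[0]
        picked.extend(rng.choice(idx, size=per_class, replace=False))
    return np.sort(np.array(picked))
```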

Acknowledgments

We thank the following repos for providing helpful components/functions used in our work.

Citation

If you find our code or paper useful, please consider giving us a star or citing:

@inproceedings{fan2024dynamic,
  title={Dynamic Sub-graph Distillation for Robust Semi-supervised Continual Learning},
  author={Fan, Yan and Wang, Yu and Zhu, Pengfei and Hu, Qinghua},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={38},
  number={11},
  pages={11927--11935},
  year={2024}
}
