Implementation for the paper "SpaceNet: Make Free Space For Continual Learning" in PyTorch.
Always sparse. Never dense. But never say never. A Sparse Training repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, i.e. Sparse Evolutionary Training, which boosts deep-learning scalability in several respects (e.g. memory and computational efficiency, representation and generalization power). A minimal prune-and-regrow sketch follows this list.
[ICML 2021] "Selfish Sparse RNN Training" by Shiwei Liu, Decebal Constantin Mocanu, Yulong Pei, Mykola Pechenizkiy
Code for "Training Adversarially Robust Sparse Networks via Bayesian Connectivity Sampling" [ICML 2021]
[IJCAI 2022] "Dynamic Sparse Training for Deep Reinforcement Learning" by Ghada Sokar, Elena Mocanu, Decebal Constantin Mocanu, Mykola Pechenizkiy, and Peter Stone.
[ICML 2022] "Training Your Sparse Neural Network Better with Any Mask" by Ajay Jaiswal, Haoyu Ma, Tianlong Chen, Ying Ding, and Zhangyang Wang.
PyTorch Implementation of TopKAST
End-to-end training of sparse deep neural networks with little-to-no performance loss.
[TMLR] Supervised Feature Selection with Neuron Evolution in Sparse Neural Networks
Repository for the SNN-22 Workshop paper "Generalization and Memorization in Sparse Neural Networks".
[Machine Learning Journal (ECML-PKDD 2022 journal track)] A Brain-inspired Algorithm for Training Highly Sparse Neural Networks
[Machine Learning Journal (ECML-PKDD 2022 journal track)] Quick and Robust Feature Selection: the Strength of Energy-efficient Sparse Training for Autoencoders
[ICML 2021] "Do We Actually Need Dense Over-Parameterization? In-Time Over-Parameterization in Sparse Training" by Shiwei Liu, Lu Yin, Decebal Constantin Mocanu, Mykola Pechenizkiy
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
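Several of the entries above (SET, Selfish Sparse RNN training, dynamic sparse training for RL, In-Time Over-Parameterization) revolve around the same core mechanism: periodically pruning low-magnitude connections and regrowing new ones, so the network stays sparse for the entire run. The sketch below illustrates that prune-and-regrow cycle in PyTorch. It is a minimal illustration only: the function name `prune_and_regrow`, the plain magnitude-pruning criterion, and the uniform random regrowth are simplifying assumptions, not the exact update rule of any repository listed here (SET regrows at random positions, for instance, while other methods use gradient-based regrowth).

```python
import torch

def prune_and_regrow(weight: torch.Tensor, mask: torch.Tensor,
                     prune_frac: float = 0.3) -> torch.Tensor:
    """One prune-and-regrow step: drop the smallest-magnitude active
    weights, then regrow the same number of connections at random
    positions that were previously inactive."""
    active = mask.bool()
    n_prune = int(prune_frac * active.sum().item())
    if n_prune == 0:
        return mask

    # Prune: find the n_prune active weights with the smallest magnitude
    # (inactive positions are filled with +inf so they are never chosen).
    magnitudes = weight.detach().abs().masked_fill(~active, float("inf"))
    drop_idx = torch.topk(magnitudes.flatten(), n_prune, largest=False).indices
    new_mask = mask.clone().flatten()
    new_mask[drop_idx] = 0.0

    # Regrow: pick n_prune random positions that were inactive before
    # this step, keeping the total number of connections constant.
    candidates = ((new_mask == 0) & ~active.flatten()).nonzero(as_tuple=True)[0]
    perm = torch.randperm(candidates.numel(), device=candidates.device)
    new_mask[candidates[perm[:n_prune]]] = 1.0
    new_mask = new_mask.view_as(mask)

    # Re-initialize the regrown weights and zero out the pruned ones.
    with torch.no_grad():
        regrown = new_mask.bool() & ~active
        weight[regrown] = 0.01 * torch.randn(int(regrown.sum()),
                                             device=weight.device)
        weight.mul_(new_mask)
    return new_mask

# Example: a ~90% sparse linear layer and one topology update.
layer = torch.nn.Linear(784, 256)
mask = (torch.rand_like(layer.weight) < 0.1).float()
with torch.no_grad():
    layer.weight.mul_(mask)
mask = prune_and_regrow(layer.weight, mask)
```

In dynamic sparse training such a step is typically invoked every few hundred optimizer updates, with the mask re-applied to the weights (and, in most of the repositories above, to the gradients) after each training step.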