[ICML 2021] "Do We Actually Need Dense Over-Parameterization? In-Time Over-Parameterization in Sparse Training" by Shiwei Liu, Lu Yin, Decebal Constantin Mocanu, Mykola Pechenizkiy
[ICLR 2022] "Sparsity Winning Twice: Better Robust Generalization from More Efficient Training" by Tianlong Chen*, Zhenyu Zhang*, Pengjun Wang*, Santosh Balachandra*, Haoyu Ma*, Zehao Wang, Zhangyang Wang
[ICLR 2023] "Sparsity May Cry: Let Us Fail (Current) Sparse Neural Networks Together!" by Shiwei Liu, Tianlong Chen, Zhenyu Zhang, Xuxi Chen, Tianjin Huang, Ajay Kumar Jaiswal, Zhangyang Wang
[ICML 2021] "Selfish Sparse RNN Training" by Shiwei Liu, Decebal Constantin Mocanu, Yulong Pei, Mykola Pechenizkiy
[TMLR] "Supervised Feature Selection with Neuron Evolution in Sparse Neural Networks"
A custom neural-network library that integrates with TensorFlow and PyTorch, dynamically adjusting the sparsity of connections during training to reduce compute and memory usage.
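The core idea shared by the repositories above is a periodic prune-and-regrow update of a binary connectivity mask. The following is a minimal sketch of a SET/RigL-style mask update in PyTorch; it is illustrative only, does not reflect the API of any repository listed here, and the function name `update_mask`, its parameters, and the shapes used are assumptions made for this example.

```python
import torch


def update_mask(weight: torch.Tensor, mask: torch.Tensor, prune_frac: float = 0.3) -> torch.Tensor:
    """Drop the lowest-magnitude active weights and regrow the same number of
    inactive connections at random, so overall sparsity stays constant."""
    active = mask.bool()
    n_prune = int(prune_frac * active.sum().item())
    if n_prune == 0:
        return mask

    # Prune: mark the n_prune smallest-magnitude active weights as inactive.
    magnitudes = weight.abs().masked_fill(~active, float("inf"))
    drop_idx = torch.topk(magnitudes.flatten(), n_prune, largest=False).indices
    new_mask = mask.flatten().clone()
    new_mask[drop_idx] = 0

    # Regrow: reactivate n_prune connections chosen at random from the inactive
    # set (actual methods may use gradient information for regrowth and usually
    # exclude positions pruned in this same step).
    inactive_idx = (new_mask == 0).nonzero(as_tuple=True)[0]
    grow_idx = inactive_idx[torch.randperm(inactive_idx.numel())[:n_prune]]
    new_mask[grow_idx] = 1

    return new_mask.view_as(mask)


# Usage sketch: re-apply the mask after each mask-update interval. Regrown
# weights start from zero because they were masked out before this update.
weight = torch.randn(256, 512)
mask = (torch.rand_like(weight) < 0.1).float()  # roughly 90% sparse at initialization
weight.data *= mask
mask = update_mask(weight, mask)
weight.data *= mask
```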