- 0. Overview
- 1. When to prune
- 1.1 Static Pruning
- 1.1.1 Pruning Before Training
- 1.1.2 Pruning During Training
- 1.1.3 Pruning After Training
- 1.1.4 Pruning In Early Training
- 1.2 Dynamic Pruning
- 2. Learning and Pruning
- 3. Application
- 4. Combination
- 5. Survey of Pruning
- 6. Other Works
- Acknowledgements
- Citation
This repo provides ongoing updates of representative neural network pruning papers and their open-source code.
Our paper, [A Survey on Deep Neural Network Pruning: Taxonomy, Comparison, Analysis, and Recommendations] (Paper Link), has been accepted by TPAMI 2024.
Taxonomy: In our survey, we provide a comprehensive review of the state-of-the-art in deep neural network pruning, which we categorize along five orthogonal axes: Universal/Specific Speedup, When to Prune, Pruning Criteria, Learn to Prune, and Fusion of Pruning and Other Techniques.
Type | L | F | C | N | H | B | M | E | W | P | Other |
---|---|---|---|---|---|---|---|---|---|---|---|
Explanation | Layer pruning | Filter pruning | Channel pruning | Neuron pruning | Head pruning | Block pruning | Matrix pruning | Embedding pruning | Weight pruning | Pioneer work | other types |
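To make the type codes above concrete, the snippet below is a minimal illustrative sketch (not taken from any paper in this list) contrasting unstructured weight pruning (type W) with structured filter pruning (type F) using PyTorch's built-in `torch.nn.utils.prune` utilities; the layer shapes and sparsity ratios are arbitrary choices for demonstration only.

```python
# Illustrative sketch only: weight (W) vs filter (F) pruning with torch.nn.utils.prune.
# Layer sizes and sparsity levels are arbitrary example values.
import torch.nn as nn
import torch.nn.utils.prune as prune

# W: unstructured weight pruning removes individual weights by L1 magnitude.
conv_w = nn.Conv2d(16, 32, kernel_size=3)
prune.l1_unstructured(conv_w, name="weight", amount=0.5)  # zero out 50% of weights

# F: structured filter pruning removes whole output filters (dim=0) by L2 norm.
conv_f = nn.Conv2d(16, 32, kernel_size=3)
prune.ln_structured(conv_f, name="weight", amount=0.25, n=2, dim=0)  # drop 8 of 32 filters

# Pruning is realized as a binary mask (weight_mask) applied to the original tensor.
print(conv_w.weight_mask.sum().item(), "weights kept in conv_w")
print(conv_f.weight_mask.sum(dim=(1, 2, 3)).ne(0).sum().item(), "filters kept in conv_f")
```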
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | No Free Prune: Information-Theoretic Barriers to Pruning at Initialization | ICML | W | - | - | Image Classification | 2024 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Data-Free Model Pruning at Initialization via Expanders | CVPRW | W | RReg | PyTorch(Author) | Image Classification | 2023 |
02 | Revisiting Pruning as Initialization through the Lens of Ramanujan Graph | ICLR (TOP 5%) | W | - | PyTorch(Author) | Image Classification | 2023 |
03 | Pruning at Initialization - A Sketching Perspective | arXiv | W | - | - | Image Classification | 2023 |
04 | NTK-SAP: Improving neural network pruning by aligning training dynamics | ICLR | W | NTK-SAP | PyTorch(Author) | Image Classification | 2023 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Prospect Pruning: Finding Trainable Weights at Initialization using Meta-Gradients | ICLR | WF | ProsPr | PyTorch(Author) | Image Classification | 2022 |
02 | Dual Lottery Ticket Hypothesis | ICLR | W | RST | PyTorch(Author) | Image Classification | 2022 |
03 | Recent Advances on Neural Network Pruning at Initialization | IJCAI | W | - | PyTorch(Author) | Image Classification | 2022 |
04 | The Unreasonable Effectiveness of Random Pruning: Return of the Most Naive Baseline for Sparse Training | ICLR | W | - | PyTorch(Author) | Image Classification | 2022 |
05 | Structured Pruning is All You Need for Pruning CNNs at Initialization | arXiv | C | PreCropping | - | Image Classification | 2022 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Progressive Skeletonization: Trimming More Fat from a network at initialization | ICLR | W | FORCE | PyTorch(Author) | Image Classification | 2021 |
02 | Robust Pruning at Initialization | ICLR | W | SPB | - | Image Classification | 2021 |
03 | A Unified Paths Perspective for Pruning at Initialization | arXiv | W | - | - | Image Classification | 2021 |
04 | Pruning Neural Networks at Initialization: Why are We Missing the Mark? | ICLR | W | - | - | Image Classification | 2021 |
05 | Why is Pruning at Initialization Immune to Reinitializing and Shuffling? | arXiv | W | - | - | Image Classification | 2021 |
06 | Dense for the Price of Sparse: Improved Performance of Sparsely Initialized Networks via a Subspace Offset | ICML | W | DCTpS | PyTorch(Author) | Image Classification | 2021 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | SNIP: Single-shot Network Pruning based on Connection Sensitivity | ICLR | WP | SNIP | TensorFlow(Author) | Image Classification | 2019 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Auto-Train-Once: Controller Network Guided Automatic Network Pruning from Scratch | CVPR | W | ATO | PyTorch(Author) | Image Classification | 2024 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | PDP: Parameter-free Differentiable Pruning is All You Need | NeurIPS | WC | - | - | Vision&NLP | 2023 |
02 | LAPP: Layer Adaptive Progressive Pruning for Compressing CNNs from Scratch | arXiv | F | LAPP | - | Image Classification | 2023 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | SuperTickets: Drawing Task-Agnostic Lottery Tickets from Supernets via Jointly Architecture Searching and Parameter Pruning | ECCV | W | SuperTickets | PyTorch(Author) | Image Classification&Object Detection&Human Pose Estimation | 2022 |
02 | Deep ensembling with no overhead for either training or testing: The all-round blessings of dynamic sparsity | ICLR | W | FreeTickets | PyTorch(Author) | Image Classification | 2022 |
03 | Gradient Flow in Sparse Neural Networks and How Lottery Tickets Win | AAAI | W | - | PyTorch(Author) | Image Classification | 2022 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Training Adversarially Robust Sparse Networks via Bayesian Connectivity Sampling | ICML | W | - | PyTorch(Author) | Adversarial Robustness | 2021 |
02 | Training Neural Networks with Fixed Sparse Masks | NeurIPS | W | - | PyTorch(Author) | Image Classification | 2021 |
03 | DPFPS: Dynamic and Progressive Filter Pruning for Compressing Convolutional Neural Networks from Scratch | AAAI | C | DPFPS | PyTorch(Author) | Image Classification | 2021 |
04 | Sparse Training via Boosting Pruning Plasticity with Neuroregeneration | NeurIPS | WF | GraNet | PyTorch(Author) | Image Classification | 2021 |
05 | Do We Actually Need Dense Over-Parameterization? In-Time Over-Parameterization in Sparse Training | ICML | W | ITOP | PyTorch(Author) | Image Classification | 2021 |
06 | Dense for the Price of Sparse: Improved Performance of Sparsely Initialized Networks via a Subspace Offset | ICML | W | DCTpS | PyTorch(Author) | Image Classification | 2021 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Pruning Filter in Filter | NeurIPS | Other | SWP | PyTorch(Author) | Image Classification | 2020 |
02 | Dynamic Sparse Training: Find Effective Sparse Network from Scratch with Trainable Masked Layers | ICLR | NF | DST | PyTorch(Author) | Image Classification | 2020 |
03 | DSA: More Efficient Budgeted Pruning via Differentiable Sparsity Allocation | ECCV | F | DSA | PyTorch(Author) | Image Classification | 2020 |
04 | Dynamic Model Pruning with Feedback | ICLR | WF | DPF | PyTorch(3rd) | Image Classification | 2020 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Exploring Sparsity in Recurrent Neural Networks | ICLR | W | - | PyTorch | Speech Recognition | 2017 |
02 | Sparse Training via Boosting Pruning Plasticity with Neuroregeneration | NeurIPS | H | GraNet | PyTorch | Image Classification | 2021 |
03 | Selfish Sparse RNN Training | ICML | W | SNT-ASGD | PyTorch(Author) | Language Modeling | 2021 |
04 | Dynamic Sparse Training for Deep Reinforcement Learning | IJCAI | W | - | PyTorch(Author) | Continuous Control | 2022 |
05 | The State of Sparse Training in Deep Reinforcement Learning | ICML | W | - | TensorFlow(Author) | Continuous Control | 2022 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Fast and Controllable Post-training Sparsity: Learning Optimal Sparsity Allocation with Global Constraint in Minutes | AAAI | W | FCPTS | - | Image Classification&Object Detection | 2024 |
02 | UPDP: A Unified Progressive Depth Pruner for CNN and Vision Transformer | AAAI | L | UPDP | - | Image Classification&Object Detection | 2024 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Unified Data-Free Compression: Pruning and Quantization without Fine-Tuning | ICCV | C | UDFC | - | Image Classification | 2023 |
02 | Unmasking the Lottery Ticket Hypothesis: What’s Encoded in a Winning Ticket’s Mask? | ICLR (TOP-25%) | W | - | - | Image Classification | 2023 |
03 | DepGraph: Towards Any Structural Pruning | CVPR | C | DepGraph | PyTorch(Author) | CV/NLP | 2023 |
04 | DFPC: Data flow driven pruning of coupled channels without data | ICLR | C | DFPC | PyTorch(Author) | Image Classification | 2023 |
05 | Memory-Oriented Structural Pruning for Efficient Image Restoration | AAAI | C | MOSP | - | Image Restoration | 2023 |
06 | Trainability Preserving Neural Structured Pruning | ICLR | F | TPP | PyTorch(Author) | Image Classification | 2023 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Optimal Brain Damage | NIPS | W | OBD | - | Image Classification | 1989 |
02 | Second Order Derivatives for Network Pruning: Optimal Brain Surgeon | NIPS | W | OBS | - | Image Classification | 1992 |
03 | Structured Pruning of Deep Convolutional Neural Networks | arXiv | C | - | - | Image Classification | 2015 |
04 | Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding | ICLR (Best) | W | - | Caffe(Author) | Image Classification | 2016 |
05 | ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression | ICCV&TPAMI | F | ThiNet | Caffe(Author), PyTorch(3rd) | Image Classification | 2017&2019 |
06 | Pruning Convolutional Neural Networks for Resource Efficient Inference | ICLR | F | - | PyTorch | Image Classification | 2017 |
07 | Pruning Filters for Efficient ConvNets | ICLR | F | PFEC | PyTorch(3rd) | Image Classification | 2017 |
08 | Channel pruning for accelerating very deep neural networks | ICCV | C | - | Caffe(Author) | Image Classification&Object Detection | 2017 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Fast and Controllable Post-training Sparsity: Learning Optimal Sparsity Allocation with Global Constraint in Minutes | AAAI | W | FCPTS | - | Image Classification&Object Detection | 2024 |
02 | UPDP: A Unified Progressive Depth Pruner for CNN and Vision Transformer | AAAI | L | UPDP | - | Image Classification&Object Detection | 2024 |
03 | Pruning Self-attentions into Convolutional Layers in Single Path | TPAMI | H | SPViT | PyTorch | Image Classification&Object Detection | 2024 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | X-Pruner: eXplainable Pruning for Vision Transformers | CVPR | CH | X-Pruner | PyTorch(Author) | Image Classification | 2023 |
02 | Global Vision Transformer Pruning with Hessian-Aware Saliency | CVPR | CH | NViT | - | Image Classification | 2023 |
03 | Pruning Parameterization with Bi-level Optimization for Efficient Semantic Segmentation on the Edge | CVPR | W | STE | - | Semantic Segmentation | 2023 |
04 | Instant Soup: Cheap Pruning Ensembles in A Single Pass Can Draw Lottery Tickets from Large Models | ICML | W | ISP | PyTorch(Author) | Image Classification&NLP | 2023 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Width & Depth Pruning for Vision Transformers | AAAI | C | WDPruning | PyTorch(Author) | Image Classification | 2022 |
02 | SAViT: Structure-Aware Vision Transformer Pruning via Collaborative Optimization | NeurIPS | CHE | SAViT | PyTorch(Author) | Image Classification&Object Detection | 2022 |
03 | VTC-LFC: Vision Transformer Compression with Low-Frequency Components | NeurIPS | C | VTC-LFC | PyTorch(Author) | Image Classification | 2022 |
04 | CP-ViT: Cascade Vision Transformer Pruning via Progressive Sparsity Prediction | arXiv | H | CP-ViT | - | Image Classification | 2022 |
05 | Unified Visual Transformer Compression | ICLR | H | UVC | PyTorch(Author) | Image Classification | 2022 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | LoSparse: Structured Compression of Large Language Models based on Low-Rank and Sparse Approximation | ICML | H | LoSparse | PyTorch(Author) | NLP | 2023 |
02 | Instant Soup: Cheap Pruning Ensembles in A Single Pass Can Draw Lottery Tickets from Large Models | ICML | W | ISP | PyTorch(Author) | Image Classification&NLP | 2023 |
03 | Gradient-Free Structured Pruning with Unlabeled Data | ICML | F | KCM | - | NLP | 2023 |
04 | The Emergence of Essential Sparsity in Large Pre-trained Models: The Weights that Matter | arXiv | W&N:M | - | PyTorch(Author) | NLP | 2023 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Structured Pruning Learns Compact and Accurate Models | ACL | LH | CoFi | PyTorch(Author) | Natural Language Understanding | 2022 |
02 | From Dense to Sparse: Contrastive Pruning for Better Pre-trained Language Model Compression | AAAI | WH | CAP | PyTorch(Author) | NLP | 2022 |
03 | PLATON: Pruning Large Transformer Models with Upper Confidence Bound of Weight Importance | ICML | WC | PLATON | PyTorch(Author) | Natural Language Understanding&Question Answering&Image Classification | 2022 |
04 | Parameter-Efficient Sparsity for Large Language Models Fine-Tuning | IJCAI | W | PST | PyTorch(Author) | Language Modeling | 2022 |
05 | The Optimal BERT Surgeon: Scalable and Accurate Second-Order Pruning for Large Language Models | EMNLP | W | oBERT | PyTorch(Author) | Natural Language Understanding | 2022 |
06 | Optimal Brain Compression: A Framework for Accurate Post-Training Quantization and Pruning | NeurIPS | W | ExactOBS | PyTorch(Author) | Image Classification&Object Detection&Question Answering | 2022 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
03 | Train Large, Then Compress: Rethinking Model Size for Efficient Training and Inference of Transformers | ICML | W | - | - | NLP | 2020 |
04 | When BERT Plays the Lottery, All Tickets Are Winning | EMNLP | W | - | PyTorch(Author) | Language Modeling | 2020 |
05 | LadaBERT: Lightweight Adaptation of BERT through Hybrid Model Compression | COLING | W | - | - | NLP (Sentiment Classification, Natural Language Inference, Pairwise Semantic Equivalence) | 2020 |
06 | Pruning Redundant Mappings in Transformer Models via Spectral-Normalized Identity Prior | EMNLP | H | - | - | NLP | 2020 |
07 | Compressing BERT: Studying the Effects of Weight Pruning on Transfer Learning | Rep4NLP | W | - | - | NLP | 2020 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Reweighted Proximal Pruning for Large-Scale Language Representation | arXiv | Other | - | - | NLP | 2019 |
02 | Efficient Transformer-based Large Scale Language Representations using Hardware-friendly Block Structured Pruning | EMNLP | Other | - | - | NLP | 2019 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | SparseGPT: Massive Language Models Can be Accurately Pruned in One-Shot | NeurIPS | WP | - | PyTorch(Author) | Language Modeling&Classification | 2023 |
02 | LLM-Pruner: On the Structural Pruning of Large Language Models | arXiv | CHP | LLM-Pruner | PyTorch(Author) | Language Modeling&Language Generation&Classification | 2023 |
03 | LoRAShear: Efficient Large Language Model Structured Pruning and Knowledge Recovery | arXiv | CH | LoRAShear | - | Language Modeling&Language Generation&Classification | 2023 |
04 | Compresso: Structured Pruning with Collaborative Prompting Learns Compact Large Language Models | arXiv | CH | Compresso | PyTorch(Author) | Classification | 2023 |
05 | Mini-GPTs: Efficient Large Language Models through Contextual Pruning | arXiv | WC | - | - | Language Modeling&Classification | 2023 |
06 | The Emergence of Essential Sparsity in Large Pre-trained Models: The Weights that Matter | arXiv | W&N:M | - | PyTorch(Author) | NLP | 2023 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Structural Pruning for Diffusion Models | NeurIPS | C | Diff-Pruning | PyTorch(Author) | Image Generation | 2023 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | ECoFLaP: Efficient Coarse-to-Fine Layer-Wise Pruning for Vision-Language Models | ICLR | L | ECoFLaP | PyTorch(Author) | VQA&Image Captioning&Image-text Retrieval&Image Classification | 2024 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Large Multimodal Model Compression via Efficient Pruning and Distillation at AntGroup | arXiv | B | - | - | Multimodal Advertisement Audition | 2023 |
02 | UPop: Unified and Progressive Pruning for Compressing Vision-Language Transformers | ICML | H | UPop | PyTorch(Author) | Image Classification&Image Caption&Image Retrieval&VQA | 2023 |
03 | Instant Soup: Cheap Pruning Ensembles in A Single Pass Can Draw Lottery Tickets from Large Models | ICML | W | ISP | PyTorch(Author) | Image Classification&NLP | 2023 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Playing Lottery Tickets with Vision and Language | AAAI | W | - | - | Vision-and-Language | 2022 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Fast and Controllable Post-training Sparsity: Learning Optimal Sparsity Allocation with Global Constraint in Minutes | AAAI | W | FCPTS | - | Image Classification | 2024 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | SparseGPT: Massive Language Models Can be Accurately Pruned in One-Shot | NeurIPS | WP | - | PyTorch(Author) | Language Modeling | 2023 |
02 | Unified Data-Free Compression: Pruning and Quantization without Fine-Tuning | ICCV | C | UDFC | - | Image Classification | 2023 |
03 | OTOv3: Automatic Architecture-Agnostic Neural Network Training and Compression from Structured Pruning to Erasing Operators | arXiv | WFC | - | - | Image Classification | 2023 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | CP-ViT: Cascade Vision Transformer Pruning via Progressive Sparsity Prediction | arXiv | H | CP-ViT | - | Image Classification | 2022 |
02 | Optimal Brain Compression: A Framework for Accurate Post-Training Quantization and Pruning | NeurIPS | W | ExactOBS | PyTorch(Author) | Image Classification&Object Detection&Question Answering | 2022 |
03 | A Fast Post-Training Pruning Framework for Transformers | NeurIPS | HF | - | PyTorch(Author) | Natural Language Understanding | 2022 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Enabling Retrain-free Deep Neural Network Pruning Using Surrogate Lagrangian Relaxation | IJCAI | W | - | - | Image Classification & Object Detection | 2021 |
02 | Accelerated Sparse Neural Training: A Provable and Efficient Method to Find N:M Transposable Masks | NeurIPS | N:M | AdaPrune | PyTorch(Author) | Image Classification | 2021 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Linear Mode Connectivity and the Lottery Ticket Hypothesis | ICML | W | - | - | Image Classification | 2020 |
02 | When To Prune? A Policy Towards Early Structural Pruning | CVPR | F | PaT | - | Image Classification | 2022 |
03 | Drawing Early-Bird Tickets: Towards More Efficient Training of Deep Networks | ICLR | W | - | PyTorch(Author) | Image Classification | 2020 |
04 | A Gradient Flow Framework For Analyzing Network Pruning | ICLR | F | - | PyTorch(Author) | Image Classification | 2021 |
No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|---|
01 | Runtime Neural Pruning | NeurIPS | F | RNP | - | Image Classification | 2017 |
02 | Channel Gating Neural Networks | NeurIPS | C | CGNet | PyTorch(Author) | Image Classification | 2019 |
03 | Dynamic Channel Pruning: Feature Boosting and Suppression | ICLR | C | FBS | PyTorch(Author) | Image Classification | 2019 |
04 | Frequency-Domain Dynamic Pruning for Convolutional Neural Networks | NeurIPS | F | FDNP | - | Image Classification | 2019 |
05 | Fire Together Wire Together: A Dynamic Pruning Approach With Self-Supervised Mask Prediction | CVPR | F | - | - | Image Classification | 2019 |
06 | Dynamic Dual Gating Neural Networks | ICCV | C | DGNet | PyTorch(Author) | Image Classification | 2021 |
07 | Manifold Regularized Dynamic Network Pruning | CVPR | F | ManiDP | PyTorch(Author) | Image Classification | 2021 |
08 | Contrastive Dual Gating: Learning Sparse Features With Contrastive Learning | CVPR | WF | CDG | - | Image Classification | 2022 |
No. | Title | Venue | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|
01 | Continual Learning via Neural Pruning | arXiv | CLNP | - | Image Classification | 2019 |
02 | Learning Bayesian Sparse Networks With Full Experience Replay for Continual Learning | CVPR | SNCL | - | Image Classification | 2022 |
03 | Continual Prune-and-Select: Class-Incremental Learning with Specialized Subnetworks | Applied Intelligence | - | PyTorch(Author) | Image Classification | 2023 |
04 | Continual Domain Adaptation through Pruning-aided Domain-specific Weight Modulation | CVPRW | PaCDA | PyTorch(Author) | Image Classification | 2023 |
No. | Title | Venue | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|
01 | Studying the impact of magnitude pruning on contrastive learning methods | ICML | - | PyTorch(Author) | Image Classification | 2020 |
02 | Training Debiased Subnetworks with Contrastive Weight Pruning | CVPR | DCWP | - | Image Classification | 2023 |
No. | Title | Venue | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|
01 | FedDUAP: Federated Learning with Dynamic Update and Adaptive Pruning Using Shared Data on the Server | IJCAI | FedDUAP | - | Image Classification | 2020 |
02 | Model Pruning Enables Efficient Federated Learning on Edge Devices | TNNLS | - | PyTorch(Author) | Image Classification | 2022 |
No. | Title | Venue | Code | APP | Year |
---|---|---|---|---|---|
01 | Deep Rewiring: Training very Sparse Deep Networks | ICLR | - | Image Classification&Audio | 2018 |
02 | Co-Evolutionary Compression for Unpaired Image Translation | ICCV | PyTorch(Author) | Image Style Translation | 2019 |
03 | Content-Aware GAN Compression | CVPR | PyTorch(Author) | Image Style Translation | 2021 |
04 | Training Neural Networks with Fixed Sparse Masks | NeurIPS | PyTorch(Author) | Image Classification | 2021 |
05 | Vision Transformer Slimming: Multi-Dimension Searching in Continuous Optimization Space | CVPR | PyTorch(Author) | Image Classification&Audio | 2022 |
06 | SuperTickets: Drawing Task-Agnostic Lottery Tickets from Supernets via Jointly Architecture Searching and Parameter Pruning | ECCV | PyTorch(Author) | Image Classification&Object Detection&Human Pose Estimation | 2022 |
No. | Title | Venue | Code | APP | Year |
---|---|---|---|---|---|
01 | When BERT Plays the Lottery, All Tickets Are Winning | EMNLP | PyTorch(Author) | Language Modeling | 2020 |
02 | The Lottery Ticket Hypothesis for Pre-trained BERT Networks | ICML | PyTorch(Author) | Language Modeling | 2021 |
03 | Structured Pruning Learns Compact and Accurate Models | ACL | PyTorch(Author) | Natural Language Understanding | 2022 |
04 | A Fast Post-Training Pruning Framework for Transformers | NeurIPS | PyTorch(Author) | Natural Language Understanding | 2022 |
05 | The Optimal BERT Surgeon: Scalable and Accurate Second-Order Pruning for Large Language Models | EMNLP | PyTorch(Author) | Natural Language Understanding | 2022 |
06 | Pruning Meets Low-Rank Parameter-Efficient Fine-Tuning | arXiv | - | Image Classification&Language Modeling | 2023 |
07 | LLM-Pruner: On the Structural Pruning of Large Language Models | arXiv | - | Language Modeling | 2023 |
No. | Title | Venue | Code | APP | Year |
---|---|---|---|---|---|
01 | Exploring Sparsity in recurrent neural networks | ICLR | PyTorch | Speech Recognition | 2017 |
02 | Deep Rewiring: Training very Sparse Deep Networks | ICLR | - | Image Classification&Audio | 2018 |
No. | Title | Venue | Code | APP | Year |
---|---|---|---|---|---|
01 | CLIP-Q: Deep Network Compression Learning by In-Parallel Pruning-Quantization | CVPR | - | Image Classification | 2018 |
02 | Accelerating Sparse Deep Neural Networks | arXiv | - | Image Classification&Object Detection&Language Translation&Language Modeling&Image Synthesis&Domain Translation&Style Transfer&Image-Image Translation&Super Resolution | 2021 |
03 | OPQ: Compressing Deep Neural Networks with One-shot Pruning-Quantization | AAAI | - | Image Classification | 2021 |
04 | Deep Model Compression Based on the Training History | arXiv | - | Image Classification | 2022 |
05 | LLM-Pruner: On the Structural Pruning of Large Language Models | arXiv | PyTorch | Causal Language Modeling | 2023 |
06 | Unified Data-Free Compression: Pruning and Quantization without Fine-Tuning | ICCV | - | Image Classification | 2023 |
No. | Title | Venue | Code | APP | Year |
---|---|---|---|---|---|
01 | Structured Pruning for Deep Convolutional Neural Networks: A survey | TPAMI | - | CV&NLP | 2024 |
02 | A survey on efficient vision transformers: algorithms, techniques, and performance benchmarking | arXiv | - | CV | 2024 |
03 | A Survey of Lottery Ticket Hypothesis | arXiv | - | CV&NLP | 2024 |
04 | Model Compression and Efficient Inference for Large Language Models: A Survey | arXiv | - | NLP | 2024 |
No. | Title | Venue | Code | APP | Year |
---|---|---|---|---|---|
01 | Why is the State of Neural Network Pruning so Confusing? On the Fairness, Comparison Setup, and Trainability in Network Pruning | arXiv | PyTorch(Author) | Image Classification | 2023 |
02 | Transforming Large-Size to Lightweight Deep Neural Networks for IoT Applications | ACM Computing Surveys | - | CV&NLP&Audio | 2023 |
03 | A Survey on Model Compression for Large Language Models | TACL | - | NLP&Unseen Instructions | 2023 |
04 | Towards Efficient Generative Large Language Model Serving: A Survey from Algorithms to Systems | arXiv | - | - | 2023 |
05 | A Survey on Dynamic Neural Networks for Natural Language Processing | arXiv | - | NLP | 2023 |
06 | Dimensionality Reduced Training by Pruning and Freezing Parts of a Deep Neural Network, a Survey | arXiv | - | CV&NLP | 2023 |
No. | Title | Venue | Code | APP | Year |
---|---|---|---|---|---|
01 | A Survey on Efficient Convolutional Neural Networks and Hardware Acceleration | Electronics | - | - | 2022 |
02 | Dimensionality Reduced Training by Pruning and Freezing Parts of a Deep Neural Network, a Survey | arXiv | - | Image Classification | 2022 |
03 | Efficient Transformers: A Survey | arXiv | - | CV&NLP | 2022 |
04 | Recent Advances on Neural Network Pruning at Initialization | IJCAI | - | CV&NLP | 2022 |
No. | Title | Venue | Code | APP | Year |
---|---|---|---|---|---|
01 | Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks | JMLR | - | Image Classification | 2021 |
02 | Dynamic Neural Networks: A Survey | arXiv | - | - | 2021 |
03 | Pruning and Quantization for Deep Neural Network Acceleration: A Survey | Neurocomputing | - | Image Classification | 2021 |
04 | Compressing Large-Scale Transformer-Based Models: A Case Study on BERT | TACL | - | NLP | 2021 |
No. | Title | Venue | Code | APP | Year |
---|---|---|---|---|---|
01 | Model Compression and Hardware Acceleration for Neural Networks: A Comprehensive Survey | IEEE | - | - | 2020 |
02 | Pruning Algorithms to Accelerate Convolutional Neural Networks for Edge Applications: A Survey | arXiv | - | Image Classification | 2020 |
03 | A Survey of Model Compression and Acceleration for Deep Neural Networks | arXiv | - | - | 2020 |
04 | An Overview of Neural Network Compression | arXiv | - | - | 2020 |
05 | Convolutional Neural Network Pruning: A Survey | CCC | - | - | 2020 |
06 | What is the State of Neural Network Pruning? | MLSys | - | - | 2020 |
07 | A comprehensive survey on model compression and acceleration | Artificial Intelligence Review | - | - | 2020 |
08 | A Survey on Deep Neural Network Compression: Challenges, Overview, and Solutions | arXiv | - | - | 2020 |
No. | Title | Venue | Code | APP | Year |
---|---|---|---|---|---|
01 | Pruning Algorithms-A Survey | IEEE Transactions on Neural Networks | - | Image Classification | 1993 |
02 | Efficient Processing of Deep Neural Networks: A Tutorial and Survey | arXiv | - | Image Classification | 2017 |
03 | Recent advances in efficient computation of deep convolutional neural networks | arXiv | - | - | 2018 |
04 | The State of Sparsity in Deep Neural Networks | arXiv | PyTorch(Author) | Image Classification&machine translation | 2019 |
No. | Title | Venue | Algorithm Name | Code | APP | Year |
---|---|---|---|---|---|---|
01 | Is Pruning Compression?: Investigating Pruning Via Network Layer Similarity | WACV | - | - | Image Classification | 2020 |
02 | A Gradient Flow Framework For Analyzing Network Pruning | ICLR | - | PyTorch(Author) | Image Classification | 2021 |
03 | Data Level Lottery Ticket Hypothesis for Vision Transformers | IJCAI | - | PyTorch(Author) | Image Classification | 2021 |
04 | Are All Layers Created Equal? | JMLR | - | - | Image Classification | 2022 |
https://github.com/airaria/TextPruner
We would like to express our gratitude to the authors of the articles cited in our survey and the authors of the following repositories.
https://github.com/he-y/awesome-Pruning/
https://github.com/MingSun-Tse/Awesome-Pruning-at-Initialization
https://github.com/csyhhu/Awesome-Deep-Neural-Network-Compression/blob/master/Paper/Pruning.md
If you find this project useful, please cite
@article{cheng2023survey,
title={A Survey on Deep Neural Network Pruning: Taxonomy, Comparison, Analysis, and Recommendations},
author={Hongrong Cheng and Miao Zhang and Javen Qinfeng Shi},
journal={arXiv preprint arXiv:2308.06767},
year={2023}
}