Research and Development Project HBRS

A Comparative Study of Sparsity Methods in Deep Neural Network for Faster Inference

Code and documentation for a research and development project on Deep Neural Network Compression, completed in partial fulfillment of the Master of Autonomous Systems program at Hochschule Bonn-Rhein-Sieg (HBRS).

Repository structure:

- L1-norm pruning: cloned from https://github.com/Eric-mingjie/rethinking-network-pruning/tree/master/cifar/l1-norm-pruning with minor modifications. Based on the implementation of the paper [Pruning Filters for Efficient ConvNets](https://arxiv.org/pdf/1608.08710.pdf); see the ranking sketch after this list.
- Weight-level pruning: cloned from https://github.com/Eric-mingjie/rethinking-network-pruning/tree/master/cifar/weight-level with minor modifications. Based on the implementation of the paper [Learning both Weights and Connections for Efficient Neural Networks](https://arxiv.org/pdf/1506.02626.pdf); a magnitude-pruning sketch follows the list.
- Knowledge Distillation methods:
	- Cloned from https://github.com/peterliht/knowledge-distillation-pytorch with minor modifications. Based on the implementation of the paper [Distilling the Knowledge in a Neural Network](https://arxiv.org/pdf/1503.02531.pdf); the distillation loss is sketched below.
	- FitNets implementation: cloned from https://github.com/AberHu/Knowledge-Distillation-Zoo with modifications. Based on the implementation of the paper [FitNets: Hints for Thin Deep Nets](https://arxiv.org/pdf/1412.6550.pdf); the hint loss is sketched below.
- Report of the research and development project
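
The L1-norm method ranks each convolutional filter by the sum of the absolute values of its weights and removes the lowest-ranked filters, yielding structured sparsity that directly shrinks the layer. A minimal PyTorch sketch of the ranking step (the helper name and `keep_ratio` parameter are illustrative, not taken from the cloned repository):

```python
import torch
import torch.nn as nn

def l1_filter_ranking(conv: nn.Conv2d, keep_ratio: float = 0.5) -> torch.Tensor:
    """Return the indices of the output filters to keep, ranked by L1 norm
    (hypothetical helper, for illustration only)."""
    # weight shape: (out_channels, in_channels, kH, kW);
    # sum |w| over everything except the output-channel dimension
    l1_norms = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    keep_idx = torch.argsort(l1_norms, descending=True)[:n_keep]
    return torch.sort(keep_idx).values

# Usage: keep the strongest quarter of a toy layer's 16 filters
conv = nn.Conv2d(3, 16, kernel_size=3)
print(l1_filter_ranking(conv, keep_ratio=0.25))  # 4 filter indices
```
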
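Weight-level pruning instead zeroes individual weights whose magnitudes fall below a threshold, producing unstructured sparsity that requires sparse kernels to realize speedups. A minimal sketch assuming a per-tensor magnitude threshold (`magnitude_prune` is a hypothetical helper):

```python
import torch

def magnitude_prune(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Return a 0/1 mask that zeroes the `sparsity` fraction of weights
    with the smallest magnitudes (hypothetical helper)."""
    k = int(weight.numel() * sparsity)
    if k == 0:
        return torch.ones_like(weight)
    # the k-th smallest absolute value serves as the pruning threshold
    threshold = weight.abs().flatten().kthvalue(k).values
    return (weight.abs() > threshold).float()

# Usage: zero ~70% of a toy weight matrix, then apply the mask
w = torch.randn(64, 64)
mask = magnitude_prune(w, sparsity=0.7)
w_pruned = w * mask
print(f"achieved sparsity: {1 - mask.mean().item():.2f}")
```
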
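In the Hinton et al. formulation, the student minimizes a weighted sum of the usual cross-entropy on hard labels and a KL-divergence term between temperature-softened teacher and student outputs. A minimal sketch of that loss (the `alpha` and `T` defaults are illustrative, not the repository's settings):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      T: float = 4.0, alpha: float = 0.9) -> torch.Tensor:
    """Soft-target KD loss from Hinton et al. (2015); T and alpha are
    illustrative defaults."""
    # KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage with toy logits for a 10-class problem
s, t = torch.randn(8, 10), torch.randn(8, 10)
y = torch.randint(0, 10, (8,))
print(distillation_loss(s, t, y))
```
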
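FitNets adds a hint stage: an intermediate student feature map (the "guided layer"), passed through a learned regressor to match dimensions, is regressed onto an intermediate teacher feature map (the "hint layer") before the usual distillation. A minimal sketch of the hint loss (the channel counts and 1x1-conv regressor are assumptions for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HintRegressor(nn.Module):
    """Maps the student's guided layer to the teacher's hint-layer
    channel count (illustrative; a larger kernel can also be used
    when spatial sizes differ)."""
    def __init__(self, student_channels: int, teacher_channels: int):
        super().__init__()
        self.conv = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_feat: torch.Tensor) -> torch.Tensor:
        return self.conv(student_feat)

# Usage: L2 regression of student features onto the teacher's hint
regressor = HintRegressor(student_channels=32, teacher_channels=64)
student_feat = torch.randn(8, 32, 16, 16)   # guided layer (student)
teacher_feat = torch.randn(8, 64, 16, 16)   # hint layer (teacher)
hint_loss = F.mse_loss(regressor(student_feat), teacher_feat)
print(hint_loss)
```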
