This repository contains the code for the paper Effective and Efficient Dropout for Deep Convolutional Neural Networks. It provides customizable and effective dropout blocks that support complex analytics with Convolutional Neural Networks.
The illustration of the convolutional transformations with four structural levels of dropout:
- Dropout, or drop-neuron, gates individual input neurons in operation 1;
- Drop-channel replaces the identity mapping in operation 2 with operation 3, i.e., random sampling and gating on channels;
- Drop-path is introduced to the convolutional transformation F_conv in operation 4;
- Drop-layer is applied to the shortcut connection in operation 5 (see the sketch after this list).
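The sketch below shows one way these four levels could be attached to a single residual block. It is a minimal, hypothetical PyTorch illustration, not the implementation in models/convBlock.py; the module name, the drop-rate parameters, and the exact placement of each gate are assumptions made for readability.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DropoutResBlockSketch(nn.Module):
    """Hypothetical residual block combining the four structural levels of dropout."""

    def __init__(self, channels, p_neuron=0.1, p_channel=0.1, p_path=0.1, p_layer=0.1):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.p_neuron, self.p_channel = p_neuron, p_channel
        self.p_path, self.p_layer = p_path, p_layer

    def forward(self, x):
        # Drop-neuron (operation 1): gate individual input neurons.
        h = F.dropout(x, p=self.p_neuron, training=self.training)
        # Drop-channel (operations 2 -> 3): randomly sample and gate whole channels
        # instead of passing them through unchanged.
        h = F.dropout2d(h, p=self.p_channel, training=self.training)
        # Residual branch F_conv (operation 4).
        h = self.bn2(self.conv2(F.relu(self.bn1(self.conv1(h)))))
        # Drop-path (operation 4): stochastically drop the whole F_conv branch.
        if self.training and torch.rand(1).item() < self.p_path:
            h = torch.zeros_like(h)
        # Drop-layer (operation 5): stochastically drop the shortcut connection.
        shortcut = x
        if self.training and torch.rand(1).item() < self.p_layer:
            shortcut = torch.zeros_like(x)
        return F.relu(h + shortcut)
```

Drop-neuron and drop-channel are rescaled during training by F.dropout/F.dropout2d; inference-time rescaling for drop-path and drop-layer (as in stochastic-depth-style training) is omitted here for brevity.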
The illustration of an example of the proposed building block.

The repository includes:
- example models (/models)
- code for dropout training (train.py)
- code supporting the different structural levels of dropout (models/convBlock.py)
- customizable building blocks supporting effective dropout (models/convBlock/conv_block)
Requirements:
* python 3.7.3
* pytorch 1.2.0
* torchvision 0.4.0
Example training code:
CUDA_VISIBLE_DEVICES=0 python train.py --net_type=resnet --depth 110 --arg1 1 --epoch 164 --weight_decay 1e-4 --block_type 0 --drop_type=1 --drop_rate=0.1 --exp_name resnet_dropChannel --report_ratio
Please check the help information of argparse.ArgumentParser in train.py for details on each argument.
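For instance, since train.py builds its options with argparse, the full list of arguments and their defaults can usually be printed with the standard help flag (assuming it has not been disabled):

python train.py --help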
To ask questions or report issues, please open an issue here or send us an email directly.