Dropout vs. batch normalization: effect on accuracy, training and inference times - code for the paper
Machine Learning Practical - Coursework 1 Report: a study of overfitting in deep neural networks on the EMNIST dataset, covering how overfitting can be detected and how it can be mitigated. The study runs experiments with network depth and width, dropout, L1 and L2 regularization, and Maxout networks.
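As a rough illustration of the techniques these repositories compare (not their actual code), the following PyTorch sketch builds a small MLP that swaps between a dropout layer and a batch-normalization layer, applies L2 regularization via weight decay, and adds an L1 penalty to the loss by hand. The layer sizes, dropout rate, and penalty coefficients are illustrative assumptions.

```python
import torch
import torch.nn as nn

def make_mlp(hidden: int = 256, use_dropout: bool = True) -> nn.Sequential:
    """Small MLP for 28x28 grayscale inputs (e.g. EMNIST)."""
    # Placing the regularizer between the linear layer and the ReLU is a
    # simplification; for ReLU, dropout before or after the activation is
    # equivalent, and batch norm is conventionally applied pre-activation.
    reg_layer = nn.Dropout(p=0.5) if use_dropout else nn.BatchNorm1d(hidden)
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(28 * 28, hidden),
        reg_layer,
        nn.ReLU(),
        nn.Linear(hidden, 47),  # EMNIST Balanced has 47 classes
    )

model = make_mlp(use_dropout=True)
# L2 regularization expressed as weight decay in the optimizer.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-4)

x = torch.randn(8, 1, 28, 28)                      # dummy input batch
y = torch.randint(0, 47, (8,))                     # dummy labels
loss = nn.functional.cross_entropy(model(x), y)
# L1 regularization added manually as a penalty on all parameters.
l1_penalty = sum(p.abs().sum() for p in model.parameters())
(loss + 1e-5 * l1_penalty).backward()
optimizer.step()
```

Note that dropout and batch norm behave differently at inference time (`model.eval()` disables dropout and switches batch norm to running statistics), which is why the first repository's paper measures their effect on inference time separately from accuracy.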