Simple ConvNet classifying MNIST data
This is meant to be a little test project. I want to play around with convolutional layers, pooling layers, regularization and normalization strategies (dropout, batch normalization), training algorithms (vanilla SGD, SGD with momentum, etc.) and much more.
Features:
- Dataset preparation
- Simple weight initialization (sketch below)
- Advanced weight initialization (Xavier initialization, etc.; sketch below)
- Convolution function (sketch below)
- Pooling layers (max pooling, average pooling, etc.; sketch below)
- Dropout (sketch below)
- Batch normalization (sketch below)
- Activation function (ReLU; sketch below)
- Loss function (cross entropy; sketch below)
- Gradient-computation function (sketch below)
- Stochastic mini-batch gradient descent (sketch below)
- Advanced SGD (momentum, RMSprop, Adam, etc.; sketch below)
- J/epoch graph
- Graphical representation of convolutional layers
- Prediction function (sketch below)
- Model evaluation (accuracy; sketch below)
- ... probably more to come ...
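As a reference for the two initialization items above, here is a minimal NumPy sketch. The function names and the uniform Xavier variant are my own choices for illustration, not necessarily what this project uses:

```python
import numpy as np

def simple_init(fan_in, fan_out, scale=0.01):
    """Simple initialization: small zero-mean Gaussian weights."""
    return np.random.randn(fan_in, fan_out) * scale

def xavier_init(fan_in, fan_out):
    """Xavier/Glorot initialization: keeps the activation variance roughly
    constant across layers by sampling U(-limit, limit) with
    limit = sqrt(6 / (fan_in + fan_out))."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return np.random.uniform(-limit, limit, size=(fan_in, fan_out))

# Example: weights of a dense layer mapping 784 flattened pixels to 128 units.
W = xavier_init(784, 128)
```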
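The convolution function could look roughly like the following naive NumPy version (note that deep-learning "convolution" is actually cross-correlation, i.e. the kernel is not flipped); the (C, H, W) layout and the names are assumptions for this sketch:

```python
import numpy as np

def conv2d(x, kernels, stride=1, pad=0):
    """Naive forward convolution for a single image.

    x       : (C, H, W) input volume.
    kernels : (F, C, KH, KW) filter bank.
    returns : (F, H_out, W_out) feature maps.
    """
    f, c, kh, kw = kernels.shape
    x = np.pad(x, ((0, 0), (pad, pad), (pad, pad)))
    _, h, w = x.shape
    h_out = (h - kh) // stride + 1
    w_out = (w - kw) // stride + 1
    out = np.zeros((f, h_out, w_out))
    for i in range(h_out):
        for j in range(w_out):
            patch = x[:, i*stride:i*stride+kh, j*stride:j*stride+kw]
            # Contract over channels and kernel height/width at once.
            out[:, i, j] = np.tensordot(kernels, patch,
                                        axes=([1, 2, 3], [0, 1, 2]))
    return out

# Example: one 28x28 grayscale MNIST image through eight 5x5 filters.
maps = conv2d(np.random.randn(1, 28, 28), np.random.randn(8, 1, 5, 5))
print(maps.shape)  # (8, 24, 24)
```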
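A matching sketch for the pooling layers; folding max and average pooling into one `pool2d` with a `mode` switch is my own simplification:

```python
import numpy as np

def pool2d(x, size=2, stride=2, mode="max"):
    """Max or average pooling over each (size x size) window.

    x : (C, H, W) feature maps; returns (C, H_out, W_out).
    """
    c, h, w = x.shape
    h_out = (h - size) // stride + 1
    w_out = (w - size) // stride + 1
    out = np.zeros((c, h_out, w_out))
    reduce_fn = np.max if mode == "max" else np.mean
    for i in range(h_out):
        for j in range(w_out):
            window = x[:, i*stride:i*stride+size, j*stride:j*stride+size]
            out[:, i, j] = reduce_fn(window, axis=(1, 2))
    return out

# Example: 2x2 max pooling halves each 24x24 feature map to 12x12.
pooled = pool2d(np.random.randn(8, 24, 24))
print(pooled.shape)  # (8, 12, 12)
```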
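Dropout in its common "inverted" form, sketched under the same assumptions; rescaling by `keep_prob` during training means nothing special has to happen at prediction time:

```python
import numpy as np

def dropout(a, keep_prob=0.8, training=True):
    """Inverted dropout: zero each unit with probability 1 - keep_prob
    and rescale the survivors so the expected activation is unchanged."""
    if not training:
        return a
    mask = (np.random.rand(*a.shape) < keep_prob) / keep_prob
    return a * mask

# Example: during training, roughly 20% of the 128 activations are zeroed.
h = dropout(np.random.randn(64, 128), keep_prob=0.8)
```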
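Batch normalization, sketched for the training-time forward pass only; a full implementation would also track running means and variances to use in place of the batch statistics at test time:

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Batch normalization (training-time forward pass).

    x : (N, D) mini-batch; gamma, beta : (D,) learned scale and shift.
    Normalizes each feature to zero mean / unit variance over the batch,
    then applies the learned affine transform.
    """
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Example: a batch of 64 activation vectors with 128 features each.
out = batchnorm_forward(np.random.randn(64, 128), np.ones(128), np.zeros(128))
```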
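A sketch of the ReLU activation and the softmax cross-entropy loss; the max-shift for numerical stability is standard practice, while the signatures are assumptions of this sketch:

```python
import numpy as np

def relu(z):
    """Rectified linear unit: max(0, z), applied element-wise."""
    return np.maximum(0, z)

def cross_entropy(logits, labels):
    """Mean softmax cross-entropy.

    logits : (N, K) raw class scores; labels : (N,) integer class ids.
    Shifting by the row max leaves the softmax unchanged but avoids
    overflow in the exponentials.
    """
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

# Example: 4 samples, 10 MNIST classes.
J = cross_entropy(np.random.randn(4, 10), np.array([3, 1, 7, 0]))
```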
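To show how the gradient computation and the stochastic mini-batch loop fit together, here is a sketch for a plain linear softmax classifier; the project itself trains a ConvNet, so treat this as a simplified stand-in rather than its actual training code:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def sgd_train(X, y, n_classes=10, lr=0.1, batch_size=64, epochs=5):
    """Stochastic mini-batch gradient descent for a linear softmax model.

    The gradient of the cross-entropy w.r.t. the logits is simply
    (probs - one_hot), which backpropagates to dW = X_batch.T @ dlogits.
    """
    n, d = X.shape
    W = np.zeros((d, n_classes))
    for _ in range(epochs):
        order = np.random.permutation(n)          # reshuffle every epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            xb, yb = X[idx], y[idx]
            probs = softmax(xb @ W)
            probs[np.arange(len(yb)), yb] -= 1.0  # dJ/dlogits
            dW = xb.T @ probs / len(yb)
            W -= lr * dW
    return W

# Example: train on random stand-in data shaped like flattened MNIST.
W = sgd_train(np.random.randn(256, 784), np.random.randint(0, 10, 256))
```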
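The advanced SGD variants differ only in their update rule. Here is a sketch of the three named above, with the optimizer state passed in and out explicitly (my simplification; a real implementation would keep state per parameter tensor):

```python
import numpy as np

def momentum_step(w, grad, v, lr=0.01, beta=0.9):
    """SGD with momentum: v is an exponential moving average of gradients."""
    v = beta * v + (1 - beta) * grad
    return w - lr * v, v

def rmsprop_step(w, grad, s, lr=0.001, beta=0.9, eps=1e-8):
    """RMSprop: scales the step by a running average of squared gradients."""
    s = beta * s + (1 - beta) * grad**2
    return w - lr * grad / (np.sqrt(s) + eps), s

def adam_step(w, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: momentum plus RMSprop, with bias correction for the
    zero-initialized moment estimates (t counts updates from 1)."""
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad**2
    m_hat = m / (1 - b1**t)
    v_hat = v / (1 - b2**t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Example: one Adam update of a weight matrix.
w = np.zeros((784, 10)); m = np.zeros_like(w); v = np.zeros_like(w)
w, m, v = adam_step(w, np.random.randn(784, 10), m, v, t=1)
```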
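Finally, prediction and accuracy evaluation, sketched under the same assumed (N, 10) logits layout:

```python
import numpy as np

def predict(logits):
    """Predicted digit = index of the highest of the 10 class scores."""
    return logits.argmax(axis=1)

def accuracy(logits, labels):
    """Fraction of samples whose predicted digit matches the label."""
    return (predict(logits) == labels).mean()

# Example: 93.14% accuracy would mean 9314 of 10000 test digits correct.
```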
Best accuracy so far: 93.14%
Figure: J/epoch graph over 1024 iterations.
Figure: convolutional weights & activations (example digits: 8, 5).
... MattMoony (August 2019)