MNIST handwritten digit recognition (+ various other demos) trained using a DIY ML framework with a TensorFlow-inspired API
Currently, the framework supports only a linear (sequential) architecture built from dense layers; a NumPy sketch of the core building blocks follows the feature table below.
The full list of features:
| Type | Features |
|---|---|
| Layers | Dense, Activation |
| Architectures | Linear |
| Optimizers | Random, SGD with momentum, Adam |
| Losses | MSE, MAE, Categorical Crossentropy |
| Metrics | MSE, MAE, Accuracy |
| Activations | Linear, ReLU, LeakyReLU, Sigmoid, Tanh, Sine |
| Dataset ops | Shuffling, even/uneven batching |
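To illustrate what the Dense layer and the SGD-with-momentum optimizer compute, here is a minimal NumPy sketch of the underlying math. It is an illustration only, not the framework's actual code or API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Dense layer: y = x @ W + b, here followed by a ReLU activation.
W = rng.normal(0.0, 0.1, size=(2, 16))
b = np.zeros(16)

def dense_relu(x):
    return np.maximum(0.0, x @ W + b)

# SGD with momentum: the velocity accumulates a decaying sum of
# past gradients and is applied as the parameter update.
def sgd_momentum_step(param, grad, velocity, lr=0.01, beta=0.9):
    velocity = beta * velocity - lr * grad
    return param + velocity, velocity
```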
Features that will likely be added in the future, in order of importance:
- Dropout layer
- Conv2D and Flatten layers
- Validation datasets
- Non-linear architectures with a Concatenate layer
- Gradient clipping (see the sketch after this list)
- Huber loss
- AdamW optimizer
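For reference, gradient clipping (one of the planned features) typically rescales all gradients by a common factor so their combined L2 norm stays under a threshold. A minimal NumPy sketch, not taken from this framework:

```python
import numpy as np

def clip_by_global_norm(grads, max_norm=1.0):
    # Rescale every gradient by the same factor so that the global
    # L2 norm of the whole gradient list does not exceed max_norm.
    global_norm = np.sqrt(sum(np.sum(g * g) for g in grads))
    scale = min(1.0, max_norm / (global_norm + 1e-12))
    return [g * scale for g in grads]
```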
The framework itself does not depend on anything other than NumPy, tqdm, and standard Python packages.
Additional dependencies are required for running the demos (an example install command follows this list):
- Pillow
- Matplotlib
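For example, everything (framework requirements plus demo extras) can be installed in one step, assuming the standard PyPI package names:

```sh
pip install numpy tqdm pillow matplotlib
```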
To launch any of the demos, simply clone the repository and execute them as modules:

```sh
git clone https://github.com/mat-kubiak/manual-mnist.git
cd manual-mnist

# (browse the `demos` dir for more demos)
python3 -m demos.sin_function
```

This demo trains a network to recreate a sine function:

Code: `demos/sin_function.py`

Demo video: `sin-function.mp4`
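Conceptually, the training data for such a demo is just sampled (x, sin x) pairs. A hypothetical sketch of how the dataset could be built with NumPy (not the demo's actual code):

```python
import numpy as np

# Inputs: x values sampled in [-pi, pi]; targets: sin(x).
x = np.linspace(-np.pi, np.pi, 256).reshape(-1, 1)
y = np.sin(x)
```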
The next demo trains a network to recreate a chosen grayscale or RGB image of any dimensions from the two pixel coordinates x and y:

Code: `demos/image_from_coordinates.py` and `demos/image_from_coordinates_rgb.py`
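The idea is to treat the image as a function from normalized pixel coordinates to intensity. A hypothetical sketch of how such a dataset could be assembled with Pillow and NumPy (`photo.png` is a placeholder filename; this is not the demo's actual code):

```python
import numpy as np
from PIL import Image

# Load a grayscale image and scale intensities to [0, 1].
img = np.asarray(Image.open("photo.png").convert("L"), dtype=np.float32) / 255.0
h, w = img.shape

# Inputs: every (x, y) coordinate, normalized to [0, 1].
ys, xs = np.mgrid[0:h, 0:w]
X = np.stack([xs.ravel() / (w - 1), ys.ravel() / (h - 1)], axis=1)

# Targets: the intensity of the pixel at each coordinate.
Y = img.ravel().reshape(-1, 1)
```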
Example grayscale image (colored with a colormap):

Demo video: `image-lenna-30fps.mp4`

Example RGB image:

Demo video: `image-rgb-30fps.mp4`
This project is licensed under the terms of the MIT license. See LICENSE for more info.