A simple, lightweight neural network library built from scratch using only pure NumPy.
To get started, clone the repository and install the package using pip. This also installs the dependencies listed in `pyproject.toml`.
```bash
# clone the repo
git clone https://github.com/henok3878/neuralnet-from-scratch.git
cd neuralnet-from-scratch

# install the package in editable mode
pip install -e .
```

Here's a quick example of how to define a model, load the MNIST dataset, and train it to recognize handwritten digits.
First, download the MNIST data files and place them in the `data/MNIST` directory at the root of this repository. The files are available from Yann LeCun's MNIST page.
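Once downloaded, the directory should contain the four standard MNIST IDX archives. The gzipped file names below are the conventional ones; check `load_mnist` for the exact names and format it expects:

```shell
mkdir -p data/MNIST
# place the downloaded files here; the standard archives are:
#   train-images-idx3-ubyte.gz
#   train-labels-idx1-ubyte.gz
#   t10k-images-idx3-ubyte.gz
#   t10k-labels-idx1-ubyte.gz
```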
```python
import numpy as np

from neuralnet.models import MLP
from neuralnet.datasets import load_mnist
from neuralnet.utils.data_processing import one_hot_encode
from neuralnet.metrics import accuracy

# load the dataset
(X_train, y_train), (X_test, y_test) = load_mnist(dir_path='./data/MNIST')
y_train_encoded = one_hot_encode(y_train, num_classes=10)[0]

# define the model
model = MLP(
    layers_size=[784, 128, 10],  # input, hidden, and output layer sizes
    activation='sigmoid',
    output_activation='softmax',
    cost_function='categorical_cross_entropy',
    optimizer='gradient_descent',
    learning_rate=2.0
)

# train
print("Starting model training...")
model.train(
    X_train,
    y_train_encoded,
    epochs=40,
    batch_size=64,
    log_interval=10
)
print("Training completed.")

# evaluate on the test set
y_pred = model.predict(X_test)
acc = accuracy(y_test, y_pred)
print(f"Model Accuracy on Test Set: {acc:.4f}")

# save the trained model
model.save("mlp_mnist.pkl")
```

The project follows a standard `src` layout:

```
src/
└── neuralnet/
    ├── __init__.py     # public API entry point
    ├── components/     # core components (activations, losses)
    ├── datasets/       # data loading utilities
    ├── layers/         # network layers
    ├── metrics.py      # evaluation functions (e.g. accuracy)
    ├── models/         # models (e.g. MLP)
    ├── optimizers/     # optimization algorithms
    └── utils/          # helper functions
```
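The `utils` and `metrics` modules wrap small pure-NumPy routines. As an illustration only (the library's actual implementations may differ), one-hot encoding and accuracy can be sketched like this:

```python
import numpy as np

def one_hot(labels, num_classes):
    # map integer class labels to one-hot row vectors
    encoded = np.zeros((labels.shape[0], num_classes))
    encoded[np.arange(labels.shape[0]), labels] = 1.0
    return encoded

def accuracy_score(y_true, y_pred):
    # fraction of samples where the predicted class matches the label
    return np.mean(y_true == y_pred)

labels = np.array([3, 0, 2])
print(one_hot(labels, num_classes=4))
print(accuracy_score(np.array([1, 2, 3]), np.array([1, 2, 0])))  # 2 of 3 correct, ~0.667
```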
This project is licensed under the MIT License.