ML Infrastructure | Computational Graphs | Backpropagation Algorithm
Modern AI product development relies heavily on high-level frameworks like PyTorch and TensorFlow. However, treating these frameworks as "black boxes" without understanding the underlying calculus leads to inefficient model design and makes problems like vanishing/exploding gradients hard to diagnose.
TorchLite is a custom-built deep-learning framework engineered from scratch using raw NumPy. It mimics the architecture of PyTorch, implementing the core Automatic Differentiation (AutoGrad) engine, forward/backward propagation pipelines, and optimization algorithms. It serves as a lightweight, educational engine that demonstrates the mathematical foundations of neural networks.
- Abstraction Opacity: Engineers often treat model training as magic. When a model fails to converge, they lack the low-level intuition to diagnose the math.
- Overhead: Standard libraries carry significant legacy and general-purpose overhead. Understanding the minimal viable requirements for a neural network is crucial for Edge AI and custom hardware implementations.
I reverse-engineered the core components of a DL library to build a modular, extensible training system.
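To make that concrete, here is a minimal sketch of the kind of reverse-mode autograd node such an engine is built around. The `Tensor` class, its method names, and the matmul-only coverage are illustrative assumptions for this sketch, not TorchLite's actual API:

```python
import numpy as np

class Tensor:
    """Minimal reverse-mode autograd node (illustrative sketch, not TorchLite's API)."""
    def __init__(self, data, parents=()):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)
        self._parents = parents       # nodes this one was computed from
        self._backward_fn = None      # propagates self.grad to the parents

    def __matmul__(self, other):
        out = Tensor(self.data @ other.data, parents=(self, other))
        def backward_fn():
            # Chain rule for C = A @ B: dA = dC @ B^T, dB = A^T @ dC
            self.grad += out.grad @ other.data.T
            other.grad += self.data.T @ out.grad
        out._backward_fn = backward_fn
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(node):
            if id(node) not in seen:
                seen.add(id(node))
                for p in node._parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = np.ones_like(self.data)   # seed: dL/dL = 1
        for node in reversed(order):
            if node._backward_fn:
                node._backward_fn()
```

Calling `backward()` on a loss node walks the graph in reverse topological order, which is exactly the chain-rule traversal described in the table below.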
| Component | Technical Implementation | PM Value Proposition |
|---|---|---|
| Computational Graph | Forward/Backward Pass | Manually implemented the chain rule for gradient propagation, making every gradient computation explicit and verifiable. |
| Layer Abstraction | Linear / Dense Classes | Created modular objects that hold state (weights/biases), allowing for "Lego-like" model assembly (see the first sketch after this table). |
| Activation Logic | ReLU, Sigmoid, Tanh | Implemented non-linearities and their derivatives to enable the network to learn complex patterns. |
| Optimization | SGD (Stochastic Gradient Descent) | Built the weight update logic to minimize loss functions (MSE, Cross-Entropy) over time (see the second sketch after this table). |
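As a hypothetical illustration of the layer and activation rows above, a stateful `Linear` layer and a `ReLU` with its derivative might look like this in raw NumPy; the class names and signatures are assumptions for the sketch, not TorchLite's exact interfaces:

```python
import numpy as np

class Linear:
    """Fully connected layer: holds weights/biases as state (illustrative sketch)."""
    def __init__(self, in_features, out_features):
        # Small random init; the real initialization scheme may differ.
        self.W = np.random.randn(in_features, out_features) * 0.01
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x                      # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        # Chain rule: gradients w.r.t. the parameters and the layer input
        self.dW = self.x.T @ grad_out
        self.db = grad_out.sum(axis=0)
        return grad_out @ self.W.T      # gradient flowing to the previous layer

class ReLU:
    """Non-linearity and its derivative (sketch)."""
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

    def backward(self, grad_out):
        return grad_out * self.mask     # derivative is 1 where x > 0, else 0
```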
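Similarly, the optimization row can be sketched as an MSE loss paired with the vanilla SGD update rule `w <- w - lr * dw`; again, the names here are illustrative assumptions:

```python
import numpy as np

def mse_loss(pred, target):
    """Mean squared error and its gradient w.r.t. pred (sketch)."""
    diff = pred - target
    loss = np.mean(diff ** 2)
    grad = 2 * diff / diff.size        # d(loss)/d(pred)
    return loss, grad

class SGD:
    """Vanilla stochastic gradient descent over a list of layers (sketch)."""
    def __init__(self, layers, lr=0.01):
        self.layers = layers
        self.lr = lr

    def step(self):
        for layer in self.layers:
            if hasattr(layer, "W"):    # only parameterized layers are updated
                layer.W -= self.lr * layer.dW
                layer.b -= self.lr * layer.db
```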
Unlike code that calls a high-level API, this project handles the raw matrix math directly:
```mermaid
graph LR
    A[Input Vector] --> B(Linear Transform Wx+b)
    B --> C{Activation ReLU}
    C --> D(Output Layer)
    D --> E[Loss Calculation]
    E -- "Backpropagation (Chain Rule)" --> A
```
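Putting the diagram into code, a single training loop under the sketches above could look like this (hypothetical usage; it assumes the `Linear`, `ReLU`, `mse_loss`, and `SGD` definitions from the earlier blocks):

```python
import numpy as np

# Tiny regression example wiring the sketched pieces together.
np.random.seed(0)
x = np.random.randn(32, 4)            # input vectors
y = np.random.randn(32, 1)            # regression targets

l1, act, l2 = Linear(4, 8), ReLU(), Linear(8, 1)
opt = SGD([l1, l2], lr=0.05)

for epoch in range(100):
    # Forward: Input -> Linear -> ReLU -> Output -> Loss
    out = l2.forward(act.forward(l1.forward(x)))
    loss, grad = mse_loss(out, y)
    # Backward: chain rule propagated from the loss back toward the input
    l1.backward(act.backward(l2.backward(grad)))
    opt.step()                        # SGD weight update

print(f"final loss: {loss:.4f}")
```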