A simple implementation of Linear Regression using just NumPy, written in Python and hosted on Google Colab ☁️.
Perfect for understanding the basics of machine learning and how linear models work under the hood.
- 📊 Pure Python + NumPy implementation
- 🧠 Gradient Descent-based training
- 📉 Visualization of predictions
- 💡 Great for learning ML fundamentals
We aim to fit a linear model:
ŷ = w * x + b
Where:
- ŷ is the predicted output
- w is the weight (slope)
- b is the bias (intercept)
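As a minimal sketch of the prediction step (the array values below are illustrative, not from the notebook), NumPy broadcasting computes ŷ for every data point at once:

```python
import numpy as np

# Illustrative inputs
x = np.array([1.0, 2.0, 3.0, 4.0])

# Model parameters (slope and intercept), arbitrary starting values here
w, b = 0.5, 1.0

# Vectorized prediction: ŷ = w * x + b for all points at once
y_hat = w * x + b
```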
J(w, b) = (1/n) * Σ (ŷᵢ - yᵢ)²
Where:
- n is the number of data points
- ŷᵢ is the predicted value
- yᵢ is the actual value
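The cost J is the mean squared error. A small helper like the one below (the name `compute_cost` is illustrative) expresses it directly in NumPy:

```python
import numpy as np

def compute_cost(y_hat, y):
    """Mean squared error: J(w, b) = (1/n) * Σ (ŷᵢ - yᵢ)²."""
    n = y.shape[0]
    return np.sum((y_hat - y) ** 2) / n
```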
w = w - α * ∂J/∂w
b = b - α * ∂J/∂b
Where α is the learning rate.
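For the MSE cost above, the gradients are ∂J/∂w = (2/n) * Σ (ŷᵢ - yᵢ) * xᵢ and ∂J/∂b = (2/n) * Σ (ŷᵢ - yᵢ). A minimal training loop might look like the following sketch (the function name `train` and the default hyperparameters are illustrative, not the notebook's exact code):

```python
import numpy as np

def train(x, y, alpha=0.01, epochs=1000):
    """Fit w and b by gradient descent on the MSE cost."""
    w, b = 0.0, 0.0
    n = x.shape[0]
    for _ in range(epochs):
        y_hat = w * x + b                        # forward pass: ŷ = w * x + b
        dw = (2 / n) * np.sum((y_hat - y) * x)   # ∂J/∂w
        db = (2 / n) * np.sum(y_hat - y)         # ∂J/∂b
        w -= alpha * dw                          # w = w - α * ∂J/∂w
        b -= alpha * db                          # b = b - α * ∂J/∂b
    return w, b

# Example usage on a noiseless line y = 2x + 1
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0
w, b = train(x, y)
```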
Click below to open the notebook in Colab:
Made with ❤️ by CraftyEngineer
This project is open source and available under the MIT License.