Implementations of several machine learning algorithms in Python using NumPy.
The following notebooks are included:
- Linear Regression
  - Simple Least Squares
  - Ordinary Least Squares (sketched below)
  - Bayesian Linear Regression
  - Least Mean Squares
- Linear Classification
  - Perceptron Learning Algorithm (sketched below)
  - Logistic Regression
  - Naive Bayes Classifier
  - Support Vector Machine (not implemented from scratch)
- Multinomial & Gaussian Naive Bayes
  - Gaussian Naive Bayes (clone from Linear Classification; sketched below)
  - Multinomial Naive Bayes
- Decision Tree and Random Forest
  - ID3 Algorithm (split criterion sketched below)
  - Random Forest
- Unsupervised Learners
  - K-Means (sketched below)
  - K Nearest Neighbours
    - Brute Force KNN (a kernel-weighted variant is sketched below)
    - KD-Tree
    - Kernels to Compute Weights: ID_weight, Epanechnikov & Tricube
  - Expectation Maximisation for Gaussian Mixture Model
- K-Means and Expectation-Maximisation for Gaussian Mixture Model (Viz) (clone from Unsupervised Learners)
  - Visual comparison of K-Means vs. EM for GMM, using MNIST data reduced to two dimensions by PCA and synthetic data drawn from N Gaussian distributions
- K-Nearest Neighbors (KNN) for classification on the Iris dataset (clone from Unsupervised Learners)
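To give a flavour of the regression notebooks, here is a minimal NumPy sketch of ordinary least squares via the normal equations. The bias-column handling and variable names are illustrative assumptions, not code taken from the notebooks:

```python
import numpy as np

def ols_fit(X, y):
    """Ordinary least squares: solve the normal equations X^T X w = X^T y."""
    Xb = np.column_stack([np.ones(len(X)), X])   # prepend a bias column
    return np.linalg.solve(Xb.T @ Xb, Xb.T @ y)  # closed-form weights

def ols_predict(w, X):
    return np.column_stack([np.ones(len(X)), X]) @ w

# Toy usage: recover y = 2x + 1 from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))
y = 2 * X[:, 0] + 1 + rng.normal(scale=0.1, size=50)
print(ols_fit(X, y))  # roughly [1.0, 2.0]
```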
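A minimal sketch of the perceptron learning rule, assuming labels in {-1, +1}; the epoch cap is an illustrative choice:

```python
import numpy as np

def perceptron_fit(X, y, epochs=100):
    """Perceptron rule: on each misclassified point, update w += y_i * x_i.
    Converges in finitely many updates if the data are linearly separable."""
    Xb = np.column_stack([np.ones(len(X)), X])  # bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        mistakes = 0
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:  # wrong side of (or on) the boundary
                w += yi * xi
                mistakes += 1
        if mistakes == 0:  # a full clean pass means convergence
            break
    return w
```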
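For the Naive Bayes notebooks, a compact Gaussian Naive Bayes sketch; the variance-smoothing constant is an assumption to avoid division by zero:

```python
import numpy as np

def gnb_fit(X, y):
    """Per-class feature means, variances, and log priors."""
    classes = np.unique(y)
    mu = np.array([X[y == c].mean(axis=0) for c in classes])
    var = np.array([X[y == c].var(axis=0) for c in classes]) + 1e-9  # smoothed
    log_prior = np.log(np.array([np.mean(y == c) for c in classes]))
    return classes, mu, var, log_prior

def gnb_predict(X, classes, mu, var, log_prior):
    """argmax over classes of log p(c) + sum_j log N(x_j | mu_cj, var_cj)."""
    diff = X[:, None, :] - mu[None, :, :]  # shape (samples, classes, features)
    ll = -0.5 * (np.log(2 * np.pi * var) + diff ** 2 / var).sum(axis=2)
    return classes[np.argmax(ll + log_prior, axis=1)]
```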
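The heart of ID3 is choosing the split with the highest information gain. Here is a minimal sketch of that criterion; the boolean-mask interface is an illustrative simplification:

```python
import numpy as np

def entropy(y):
    """Shannon entropy H(y) = -sum_c p_c log2 p_c of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def information_gain(y, mask):
    """Entropy reduction from splitting labels y by a boolean mask."""
    left, right = y[mask], y[~mask]
    if len(left) == 0 or len(right) == 0:
        return 0.0  # a degenerate split gains nothing
    n = len(y)
    return entropy(y) - (len(left) / n) * entropy(left) \
                      - (len(right) / n) * entropy(right)
```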
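A minimal Lloyd's-algorithm sketch of K-Means; the random initialisation and convergence test are illustrative choices:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Alternate assignment and update steps until the centroids stop moving."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]  # random init
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Update step: each centroid moves to the mean of its points.
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels
```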
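Finally, a brute-force KNN sketch with Epanechnikov-kernel vote weighting; scaling distances by the k-th neighbour's distance is one common bandwidth choice, assumed here rather than taken from the notebooks:

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel: K(u) = 0.75 * (1 - u^2) for |u| <= 1, else 0."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)

def knn_predict(X_train, y_train, X_query, k=5):
    """Brute-force KNN with kernel-weighted majority voting."""
    preds = []
    for x in X_query:
        dist = np.linalg.norm(X_train - x, axis=1)  # distances to all points
        idx = np.argsort(dist)[:k]                  # indices of the k nearest
        bandwidth = dist[idx].max() + 1e-12         # k-th neighbour sets the scale
        weights = epanechnikov(dist[idx] / bandwidth)
        votes = {}                                  # weighted vote per label
        for label, weight in zip(y_train[idx], weights):
            votes[label] = votes.get(label, 0.0) + weight
        preds.append(max(votes, key=votes.get))
    return np.array(preds)
```

Note that with this bandwidth the farthest of the k neighbours receives zero weight, which is the usual behaviour of the Epanechnikov kernel at the edge of its support.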
The code is provided as Jupyter notebooks (.ipynb), which you can view directly on GitHub or download and run on your local machine. Each notebook contains a clear implementation of its algorithm, along with the relevant formulas and pseudocode. For a better understanding of any particular algorithm, check out the corresponding notebook.
References:

- Pattern Recognition and Machine Learning by Christopher Bishop
- Machine Learning: An Algorithmic Perspective, Second Edition by Stephen Marsland