Note that some mathematical expressions may not render properly in the GitHub online IPython viewer; please use nbviewer instead.
The main objective of this set of notebooks is to implement logistic regression from scratch. We start by deriving the negative log likelihood of logistic regression. Then, to build the logistic regression algorithm, the negative log likelihood is minimized using a simple optimization algorithm called gradient descent. Along the way, we discuss many related details such as regularization, handling large datasets, and more.
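As a preview, the approach described above can be sketched in NumPy roughly as follows. This is a minimal illustration, not the notebooks' actual code; the function names and the learning-rate and iteration-count defaults are assumptions made for this example.

```python
import numpy as np

def sigmoid(z):
    # Logistic function mapping linear scores to probabilities in (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def negative_log_likelihood(w, X, y):
    # y is a {0, 1} label vector; X has shape (n_samples, n_features)
    p = sigmoid(np.dot(X, w))
    eps = 1e-12  # guard against log(0)
    return -np.sum(y * np.log(p + eps) + (1.0 - y) * np.log(1.0 - p + eps))

def gradient_descent(X, y, lr=0.01, n_iters=500):
    # Minimize the negative log likelihood with fixed-step gradient descent
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        grad = np.dot(X.T, sigmoid(np.dot(X, w)) - y)  # gradient of the NLL
        w -= lr * grad
    return w
```

Each update moves the weights opposite the gradient of the negative log likelihood, which for logistic regression has the simple closed form `X.T (sigmoid(Xw) - y)`.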
In the later part of each notebook, we show how to use our simple logistic regression implementation to build predictive models for real-world applications using actual datasets. The key features of these notebooks are:
- Derive the negative log likelihood of the logistic regression algorithm.
- Build a logistic regression model from scratch using Python/NumPy.
- Test the basic building blocks of the model (written as Python functions) with unit tests.
- Run the logistic regression algorithm against a simple 2D dataset.
- Explore regularization.
- Handle large datasets using mini-batch gradient descent.
- Apply the logistic regression algorithm to practical datasets.
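For the mini-batch and regularization items above, a minimal sketch might look like the following. The function name, batch size, and L2 penalty strength here are illustrative assumptions, not values taken from the notebooks.

```python
import numpy as np

def minibatch_gradient_descent(X, y, lr=0.1, batch_size=32, n_epochs=30,
                               l2=0.01, seed=0):
    # Mini-batch gradient descent on the L2-regularized negative log likelihood.
    rng = np.random.RandomState(seed)
    n = X.shape[0]
    w = np.zeros(X.shape[1])
    for _ in range(n_epochs):
        order = rng.permutation(n)  # reshuffle the examples every epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            p = 1.0 / (1.0 + np.exp(-np.dot(X[idx], w)))
            # Averaged gradient of the batch NLL, plus the L2 penalty term
            grad = np.dot(X[idx].T, p - y[idx]) / len(idx) + l2 * w
            w -= lr * grad
    return w
```

Because each update touches only `batch_size` rows of `X`, this scales to datasets too large to process in a single full-gradient step, at the cost of noisier updates.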
This project requires Python 2.7 and the following Python libraries installed.
In a terminal or command window, navigate to the top-level project directory `logistic_regression/` (the one that contains this README) and run the following command:
```
ipython notebook logistic_regression_[1|2].ipynb
```
This will open the selected IPython Notebook in your browser.