# Paper-Implementation-Overview-Gradient-Descent-Optimization-Algorithms

Implementation of the arXiv paper "An Overview of Gradient Descent Optimization Algorithms" by Sebastian Ruder, written in Python 2.7.

Links to the original paper published on arXiv.org > cs > arXiv:1609.04747: [1], [2]

Implemented the following Gradient Descent Optimization Algorithms from scratch:

- Vanilla Batch/Stochastic Gradient Descent [3]
- Momentum [4] [5]
- Nesterov Accelerated Gradient [6]
- Adagrad [7]
- Adadelta [8]
- RMSProp [9]
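As a quick illustration of the kind of update rules these implementations cover, here is a minimal sketch (not the repository's code) of vanilla gradient descent, Momentum, and RMSProp applied to a toy 1-D quadratic. The function names, hyperparameter values, and toy objective are assumptions chosen for readability; it runs under both Python 2.7 and Python 3.

```python
# Illustrative sketch only (not the repository's code): parameter updates for
# vanilla gradient descent, Momentum, and RMSProp on the toy objective
# f(theta) = (theta - 3)^2, whose gradient is 2 * (theta - 3).

def grad(theta):
    # Gradient of the toy objective f(theta) = (theta - 3)^2
    return 2.0 * (theta - 3.0)

def vanilla_gd(theta, lr=0.1, steps=100):
    # theta = theta - lr * grad(theta)
    for _ in range(steps):
        theta -= lr * grad(theta)
    return theta

def momentum(theta, lr=0.1, gamma=0.9, steps=100):
    # v = gamma * v + lr * grad(theta); theta = theta - v
    v = 0.0
    for _ in range(steps):
        v = gamma * v + lr * grad(theta)
        theta -= v
    return theta

def rmsprop(theta, lr=0.01, decay=0.9, eps=1e-8, steps=500):
    # E[g^2] = decay * E[g^2] + (1 - decay) * g^2
    # theta = theta - lr * g / sqrt(E[g^2] + eps)
    eg2 = 0.0
    for _ in range(steps):
        g = grad(theta)
        eg2 = decay * eg2 + (1.0 - decay) * g * g
        theta -= lr * g / (eg2 + eps) ** 0.5
    return theta

if __name__ == "__main__":
    # All three should move theta toward the minimum at theta = 3
    print(vanilla_gd(0.0))
    print(momentum(0.0))
    print(rmsprop(0.0))
```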