This notebook explores optimisation approaches used in deep learning, building a model from scratch with JAX and demonstrating the underlying equations symbolically with SymPy.
The main purpose of this project is to understand two optimiser variants: gradient descent and Newton's second-order update, which uses the Hessian matrix. To compare them, a minimum viable model is built from scratch with JAX, and the key steps of the deep learning training process are demonstrated along the way.
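As a minimal sketch of the two update rules being compared: gradient descent takes steps $\theta_{t+1} = \theta_t - \eta\,\nabla f(\theta_t)$, while Newton's method rescales the gradient by the inverse Hessian, $\theta_{t+1} = \theta_t - H(\theta_t)^{-1}\nabla f(\theta_t)$. The snippet below illustrates both in JAX on a toy objective; the function `f`, the starting point, and the learning rate are illustrative assumptions, not taken from the notebook.

```python
# Minimal sketch: gradient descent vs. Newton's method in JAX.
# The objective, starting point, and learning rate below are assumptions
# chosen for illustration only.
import jax
import jax.numpy as jnp

def f(x):
    # Toy objective: a smooth bowl with a mild non-quadratic term.
    return jnp.sum(x**2) + jnp.sin(x[0])

grad_f = jax.grad(f)
hess_f = jax.hessian(f)

def gd_step(x, lr=0.1):
    # First-order update: x <- x - lr * grad f(x)
    return x - lr * grad_f(x)

def newton_step(x):
    # Second-order update: x <- x - H(x)^{-1} grad f(x),
    # computed by solving a linear system rather than inverting H.
    return x - jnp.linalg.solve(hess_f(x), grad_f(x))

x_gd = x_newton = jnp.array([2.0, -1.5])
for _ in range(10):
    x_gd = gd_step(x_gd)
    x_newton = newton_step(x_newton)

print("GD:    ", x_gd, f(x_gd))
print("Newton:", x_newton, f(x_newton))
```

Note that the Newton step solves a linear system instead of explicitly inverting the Hessian, which is the standard numerically stable choice.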
- Utilisation of JAX for efficient numerical optimisation.
- SymPy's symbolic mathematics for clear formulation and demonstration (see the symbolic sketch after this list).
- Testing and comparison of simple gradient descent and Newton's second-order method.
- Visualisation of the complex behaviour of the optimisation process.
- Exploration of the less common Newton second-order update alongside the widely used gradient descent algorithm.
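To complement the JAX sketch above, here is a minimal SymPy sketch of the same two update rules in symbolic form; the expression `expr` and the symbols are assumptions chosen for illustration, not the notebook's actual model.

```python
# Minimal sketch: symbolic gradient descent and Newton updates with SymPy.
# The objective expression below is an assumption for illustration only.
import sympy as sp

x, y, eta = sp.symbols('x y eta')
expr = x**2 + sp.sin(x) + y**2   # assumed toy objective

vars_ = sp.Matrix([x, y])
grad = sp.Matrix([sp.diff(expr, v) for v in vars_])   # symbolic gradient
hess = sp.hessian(expr, (x, y))                       # symbolic Hessian

# Symbolic forms of the two updates discussed above.
gd_update = vars_ - eta * grad              # gradient descent step
newton_update = vars_ - hess.inv() * grad   # Newton step

sp.pprint(gd_update)
sp.pprint(newton_update)
```

Printing the two updates side by side makes the structural difference explicit: gradient descent scales the gradient by a scalar learning rate, while Newton's method premultiplies it by the inverse Hessian.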