This project implements a set of optimization algorithms in Python using only NumPy. The implemented first-order algorithms are:
- Momentum
- AdaGrad
- RMSProp
- Adam
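As a sketch of what such a NumPy-only implementation can look like, here is a minimal version of the Adam update rule. The function name, hyperparameter defaults, and quadratic test problem are illustrative assumptions, not the project's actual code:

```python
import numpy as np

def adam(grad, x0, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8, steps=5000):
    """Minimal Adam sketch: bias-corrected first/second moment estimates."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # first moment (running mean of gradients)
    v = np.zeros_like(x)  # second moment (running mean of squared gradients)
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)  # bias correction for zero initialization
        v_hat = v / (1 - beta2**t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Illustrative use: minimize f(x) = ||x||^2, whose gradient is 2x.
x_min = adam(lambda x: 2 * x, np.array([1.0, -1.0]))
```

Momentum, AdaGrad, and RMSProp fit the same loop structure; they differ only in which moment estimates they track and how the step is scaled.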
In addition, the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton optimizer is implemented. The project's goal is a comparative analysis of the results obtained with BFGS and those obtained with Adam.
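A minimal BFGS sketch, with a backtracking (Armijo) line search, might look like the following. The curvature guard, tolerances, and the Rosenbrock test problem are illustrative choices, not necessarily those used in the project:

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    """Minimal BFGS sketch: rank-two updates to an inverse-Hessian estimate."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)  # inverse-Hessian approximation, starts as identity
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g  # quasi-Newton search direction
        # Backtracking line search enforcing the Armijo condition
        alpha, fx = 1.0, f(x)
        while f(x + alpha * p) > fx + 1e-4 * alpha * (g @ p):
            alpha *= 0.5
        s = alpha * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-10:  # skip update if curvature condition fails
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Illustrative use: the Rosenbrock function, minimum at (1, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
x_min = bfgs(f, grad, np.array([-1.2, 1.0]))
```

Unlike Adam, BFGS uses curvature information accumulated from gradient differences, which is why it is an interesting baseline for the comparison.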
The project involves the following steps:
- Implementing the optimization algorithms using NumPy.
- Implementing the BFGS optimizer.
- Conducting experiments to compare the results of the BFGS optimizer with those of Adam.
- Analyzing the results and drawing conclusions.
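The comparison step above could be prototyped as follows: run both optimizers for the same iteration budget on a shared test problem and compare the final objective values. The ill-conditioned quadratic, the budget, and all hyperparameters here are illustrative assumptions:

```python
import numpy as np

# Illustrative test problem: an ill-conditioned quadratic f(x) = 0.5 x^T A x.
A = np.diag([1.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x0 = np.array([1.0, 1.0])
N = 200  # identical iteration budget for both optimizers

# --- Adam (first-order) ---
x, m, v = x0.copy(), np.zeros(2), np.zeros(2)
lr, b1, b2, eps = 0.01, 0.9, 0.999, 1e-8
for t in range(1, N + 1):
    g = grad(x)
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g**2
    x -= lr * (m / (1 - b1**t)) / (np.sqrt(v / (1 - b2**t)) + eps)
f_adam = f(x)

# --- BFGS (quasi-Newton) with a backtracking line search ---
x, H = x0.copy(), np.eye(2)
g = grad(x)
for _ in range(N):
    if np.linalg.norm(g) < 1e-12:
        break
    p = -H @ g
    alpha, fx = 1.0, f(x)
    while f(x + alpha * p) > fx + 1e-4 * alpha * (g @ p):
        alpha *= 0.5
    s = alpha * p
    x_new, g_new = x + s, grad(x + s)
    y = g_new - g
    sy = s @ y
    if sy > 1e-10:  # curvature guard
        rho, I = 1.0 / sy, np.eye(2)
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
    x, g = x_new, g_new
f_bfgs = f(x)

print(f"Adam final loss: {f_adam:.3e}, BFGS final loss: {f_bfgs:.3e}")
```

On a problem like this, the quasi-Newton method typically reaches a much lower loss within the same budget, which is the kind of observation the analysis step would quantify.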
Through this project, we hope to gain a deeper understanding of optimization algorithms and their performance in various scenarios.