Limited Accelerated Gradient Optimizer (LAG)

We introduce Limited Accelerated Gradient (LAG), a first-order optimization method for training gradient-based machine learning models, designed to be robust to noisy data. Compared to recent first-order optimizers such as Adam and its variants, LAG is easy to tune and efficient to run because it requires fewer hyperparameters. Despite its simplicity, it performs strongly in our experiments, achieving considerable improvements over strong baselines. We also provide an analysis of the method's optimization landscape and, from its approximation trajectory, summarize insights for first-order optimization of deep neural networks.
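A minimal sketch of how the optimizer might be dropped into a standard PyTorch training loop, assuming the repository exposes a `torch.optim`-style class `LAG` taking a learning rate (the `lag_optim` import path and the constructor signature are assumptions, not the repository's documented API):

```python
# Hypothetical usage sketch -- the `lag_optim` module name and the
# LAG constructor signature are assumptions; consult the repository
# for the actual API.
import torch
import torch.nn as nn

from lag_optim import LAG  # assumed import path

model = nn.Linear(10, 1)
loss_fn = nn.MSELoss()

# Fewer hyperparameters than Adam: here only a learning rate is assumed.
optimizer = LAG(model.parameters(), lr=1e-3)

# Toy regression data for illustration.
x = torch.randn(32, 10)
y = torch.randn(32, 1)

for step in range(100):
    optimizer.zero_grad()          # clear accumulated gradients
    loss = loss_fn(model(x), y)    # forward pass
    loss.backward()                # backpropagate
    optimizer.step()               # apply the LAG update
```

If the class follows the `torch.optim.Optimizer` interface as assumed, it can replace Adam in an existing training script with a one-line change.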
