We introduce Limited Accelerated Gradient (LAG), a first-order optimization method for training gradient-based machine learning models that is designed to be robust to noisy data. Compared with recent first-order optimizers such as ADAM and its variants, LAG is easy to tune and efficient to run because it requires fewer hyperparameters. Despite this simplicity, it performs strongly and achieves considerable improvements over all strong baselines in our empirical results. We also provide an analysis of the method's optimization landscape and, from its approximation trajectory, summarize some insights for first-order optimization of deep neural networks.
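The description above positions LAG within the family of accelerated first-order methods but does not spell out its update rule, so the snippet below is background only: a minimal NumPy sketch of a classical accelerated gradient step (Nesterov momentum), the kind of method LAG is compared against. The function `nesterov_step`, its parameters, and the toy objective are illustrative assumptions, not code from this repository.

```python
# A minimal sketch of a classical accelerated first-order update
# (Nesterov momentum). This is NOT the LAG update rule, which is not
# specified here; it only illustrates the family of accelerated
# gradient methods referenced above.
import numpy as np

def nesterov_step(params, velocity, grad_fn, lr=0.01, momentum=0.9):
    """One Nesterov accelerated gradient step.

    params, velocity : np.ndarray of the same shape
    grad_fn          : callable returning the gradient at a point
    """
    # Evaluate the gradient at the momentum "look-ahead" point.
    lookahead = params + momentum * velocity
    grad = grad_fn(lookahead)
    # Update the velocity, then move the parameters along it.
    velocity = momentum * velocity - lr * grad
    params = params + velocity
    return params, velocity

# Toy usage: minimize f(x) = ||x||^2, whose gradient is 2x.
x = np.array([3.0, -2.0])
v = np.zeros_like(x)
for _ in range(100):
    x, v = nesterov_step(x, v, grad_fn=lambda p: 2.0 * p)
print(x)  # close to the minimizer [0, 0]
```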
liyt96/LAG_Optim
About
Limited Accelerated Gradient Optimizer