
Nostalgic Adam

Code and supplements for the IJCAI 2019 paper "Nostalgic Adam: Weighting More of the Past Gradients When Designing the Adaptive Learning Rate".

Haiwen Huang, Chang Wang, Bin Dong (http://bicmr.pku.edu.cn/~dongbin/Publications/NosAdam.pdf)

Dependencies: Python >= 3.5, PyTorch >= 0.4.0

An introduction to the paper in Chinese: https://zhuanlan.zhihu.com/p/65625686
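
For a quick sense of the algorithm, here is a minimal sketch of the NosAdam-HH update as a PyTorch optimizer: the second moment uses hyperharmonic weights b_k = k^(-gamma), so beta_{2,t} = B_{t-1}/B_t with B_t = sum_{k=1}^t b_k, which keeps more weight on past squared gradients than Adam's fixed beta_2. The class name `NosAdamSketch`, the default hyperparameters, the alpha/sqrt(t) step-size schedule, and the omission of bias correction are illustrative assumptions written against a recent PyTorch, not this repository's API; see the optimizer code in this repo for the authors' implementation.

```python
import torch
from torch.optim import Optimizer


class NosAdamSketch(Optimizer):
    """Hypothetical sketch of the NosAdam-HH update, for illustration only."""

    def __init__(self, params, lr=1e-3, beta1=0.9, gamma=0.1, eps=1e-8):
        # gamma=0.1 is an illustrative default, not a value from the paper.
        defaults = dict(lr=lr, beta1=beta1, gamma=gamma, eps=eps)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self):
        for group in self.param_groups:
            for p in group['params']:
                if p.grad is None:
                    continue
                g = p.grad
                state = self.state[p]
                if len(state) == 0:
                    state['step'] = 0
                    state['B'] = 0.0                  # running sum B_t of the weights b_k
                    state['m'] = torch.zeros_like(p)  # first moment (momentum)
                    state['v'] = torch.zeros_like(p)  # weighted second moment
                state['step'] += 1
                t = state['step']

                # Hyperharmonic weight b_t = t^(-gamma); beta_{2,t} = B_{t-1} / B_t.
                b_t = t ** (-group['gamma'])
                B_prev = state['B']
                B_t = B_prev + b_t
                state['B'] = B_t
                beta2_t = B_prev / B_t  # 0 at t=1, so v_1 = g_1^2

                m, v = state['m'], state['v']
                m.mul_(group['beta1']).add_(g, alpha=1 - group['beta1'])
                v.mul_(beta2_t).addcmul_(g, g, value=1 - beta2_t)

                # alpha_t = alpha / sqrt(t), the schedule used in AMSGrad-style
                # convergence analyses; an assumption here, not the repo's default.
                step_size = group['lr'] / (t ** 0.5)
                p.addcdiv_(m, v.sqrt().add_(group['eps']), value=-step_size)
```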

If you find this code useful, please cite:

@inproceedings{ijcai2019-355,
  title     = {Nostalgic Adam: Weighting More of the Past Gradients When Designing the Adaptive Learning Rate},
  author    = {Huang, Haiwen and Wang, Chang and Dong, Bin},
  booktitle = {Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence, {IJCAI-19}},
  publisher = {International Joint Conferences on Artificial Intelligence Organization},
  pages     = {2556--2562},
  year      = {2019},
  month     = {7},
  doi       = {10.24963/ijcai.2019/355},
  url       = {https://doi.org/10.24963/ijcai.2019/355},
}
