
How to set learning rate decay? Annealing the learning rate #1167

Closed
@OleNet

Description


In optimizers.py I see parameters for setting learning rate decay, but there are two of them: learning_rate_decay_a=0. and learning_rate_decay_b=0.
What is the difference between these two parameters, what does each one mean, and which one should I use?
There doesn't seem to be any wiki or documentation covering them.
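
For context, here is a minimal sketch of the two annealing schedules in which a pair of decay parameters like these commonly appears. The function names `poly_decay` and `exp_decay` and the exact formulas below are illustrative assumptions, not a statement of what PaddlePaddle's optimizers.py actually computes with `learning_rate_decay_a` and `learning_rate_decay_b`:

```python
# Illustrative sketch only -- assumed annealing formulas, not PaddlePaddle's
# confirmed semantics for learning_rate_decay_a / learning_rate_decay_b.

def poly_decay(base_lr, num_samples, decay_a, decay_b):
    """Polynomial annealing: the learning rate shrinks as more samples are
    processed; decay_a scales the sample count, decay_b is the exponent."""
    return base_lr * (1.0 + decay_a * num_samples) ** (-decay_b)


def exp_decay(base_lr, num_samples, decay_a, decay_b):
    """Exponential annealing: the learning rate is multiplied by decay_a
    once every decay_b samples (fractionally, here)."""
    return base_lr * decay_a ** (num_samples / decay_b)


if __name__ == "__main__":
    base_lr = 0.1
    for n in (0, 10_000, 100_000, 1_000_000):
        print(n,
              poly_decay(base_lr, n, decay_a=1e-5, decay_b=0.75),
              exp_decay(base_lr, n, decay_a=0.5, decay_b=100_000))
```

In schedules like these, the two parameters play different roles (one controls how fast the rate falls, the other the scale or interval over which it falls), which may be why optimizers.py exposes them as a pair rather than a single decay value.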
