
learning rate scheduler added #3649

Closed
wants to merge 2 commits into from

Conversation

soroushhashemifar

No description provided.

@ftyers
Collaborator

ftyers commented May 17, 2021

Thanks @soroushhashemifar, could you give some details about how you imagine this should be used, and why the "reduce on plateau" functionality doesn't already solve this?

@soroushhashemifar
Author

Thanks for your attention @ftyers. I ran into this problem while training DeepSpeech for the Persian language. Reduce on Plateau can let the model drift into regions of the parameter space with higher loss, because it takes time to detect the plateau. Scheduling the learning rate by epoch makes more sense here: as training progresses we want less oscillation in the parameters, so lowering the learning rate on a fixed schedule helps a lot.
This feature is also provided by popular frameworks such as TensorFlow and PyTorch, so I thought: why not in DeepSpeech?
Finally, the parameter is easy to work with because of its Pythonic format. It is a compact form of nested if/else blocks, and since it is evaluated at runtime, you can use any Python function inside the learning rate scheduler parameter; a sketch of how such an expression might look is shown below.
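To make the idea concrete, here is a minimal, hypothetical sketch of such an epoch-indexed schedule expression. The expression string, the evaluation helper, and their names are illustrative assumptions for this comment, not the flag or code actually added by this PR.

```python
import math

# Hypothetical example of a schedule written as one Python expression:
# a compact chain of conditional expressions (nested if/else), indexed by epoch.
LR_SCHEDULE_EXPR = (
    "0.001 if epoch < 10 else "
    "0.0005 if epoch < 20 else "
    "0.001 * math.exp(-0.05 * (epoch - 20))"
)

def scheduled_lr(expr, epoch):
    # The expression is evaluated at runtime, so any Python function exposed
    # in the namespace (here, the math module) can be used inside it.
    return eval(expr, {"math": math, "epoch": epoch})

for epoch in (0, 10, 25):
    print(epoch, scheduled_lr(LR_SCHEDULE_EXPR, epoch))
```

In spirit this is similar to `torch.optim.lr_scheduler.LambdaLR` in PyTorch or `tf.keras.optimizers.schedules.LearningRateSchedule` in TensorFlow, where the learning rate is defined as a function of the training step or epoch.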

@soroushhashemifar closed this by deleting the head repository on Jul 5, 2023