Conversation


@stas00 stas00 commented Jan 8, 2021

This PR rounds overly long fractions in the trainer's logged state, e.g.:

{'loss': 14.846837043762207, 'learning_rate': 6e-06, 'epoch': 0.3333333333333333} 

to:

  • epoch: 2 decimals
  • loss: 4 decimals

resulting in:

{'loss': 14.8468, 'learning_rate': 6e-06, 'epoch': 0.33}

If you'd like me to add any other small tweaks, please let me know.

Fixes: #9475
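
The rounding rule above can be sketched as a small helper. This is a minimal illustration only — the helper name `round_log_entry` is hypothetical, and the actual change inside the `Trainer` may be implemented differently:

```python
# Hypothetical sketch of the rounding described in this PR; the real
# Trainer code may apply it elsewhere in the logging path.
def round_log_entry(logs: dict) -> dict:
    """Round noisy float fields in a trainer log dict for display."""
    rounded = dict(logs)
    if "loss" in rounded:
        rounded["loss"] = round(rounded["loss"], 4)    # loss: 4 decimals
    if "epoch" in rounded:
        rounded["epoch"] = round(rounded["epoch"], 2)  # epoch: 2 decimals
    return rounded

print(round_log_entry(
    {"loss": 14.846837043762207, "learning_rate": 6e-06, "epoch": 0.3333333333333333}
))
# → {'loss': 14.8468, 'learning_rate': 6e-06, 'epoch': 0.33}
```

Note that `learning_rate` is left untouched, since small values like `6e-06` would be destroyed by fixed-decimal rounding.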

@LysandreJik LysandreJik left a comment

LGTM!

@LysandreJik LysandreJik requested a review from sgugger January 11, 2021 11:18
@stas00 stas00 merged commit e6f211c into huggingface:master Jan 11, 2021
@stas00 stas00 deleted the trainer-epoch branch January 11, 2021 18:17