
Conversation

@vfdev-5 vfdev-5 commented Apr 14, 2022

Description:

  • Removed warning in DDP if Metric.reset/update are not decorated.

Context:
The reinit__is_reduced and sync_all_reduce decorators are optional helpers for computing the full metric value in a distributed context. Users are free to skip them and instead perform all_reduce-like ops on the accumulated variables themselves inside compute. The decorators also ensure that compute does not perform collective ops twice when it is called several times without an intervening reset/update: this is exactly what reinit__is_reduced guards against.
Previously, users who developed their own metric and preferred to do the all_reduce by their own means saw a warning in DDP; this PR removes that warning.
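The double-reduction guard that reinit__is_reduced provides can be sketched in plain Python. This is a hypothetical standalone sketch of the pattern, not Ignite's actual implementation: the class name CustomAccuracy and the _all_reduce placeholder are illustrative, and in real DDP code the reduction would be a collective op such as ignite.distributed.all_reduce.

```python
# Hypothetical sketch of the guard pattern behind reinit__is_reduced:
# compute() must not re-reduce an already-reduced accumulator when it is
# called several times without an intervening reset()/update().

class CustomAccuracy:
    def __init__(self):
        self.reset()

    def reset(self):
        self._num_correct = 0
        self._num_examples = 0
        self._is_reduced = False  # cleared: the next compute() may reduce

    def update(self, num_correct, num_examples):
        self._num_correct += num_correct
        self._num_examples += num_examples
        self._is_reduced = False  # new data: a fresh reduction is needed

    def _all_reduce(self, value):
        # Placeholder for a collective op (e.g. idist.all_reduce in DDP);
        # identity here so the sketch runs without a process group.
        return value

    def compute(self):
        if not self._is_reduced:
            self._num_correct = self._all_reduce(self._num_correct)
            self._num_examples = self._all_reduce(self._num_examples)
            self._is_reduced = True  # guard: skip reduction on repeat calls
        return self._num_correct / self._num_examples
```

Calling compute twice in a row returns the same value without reducing twice; only a subsequent reset or update re-arms the reduction.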

Check list:

  • New tests are added (if a new feature is added)
  • New doc strings: description and/or example code are in RST format
  • Documentation is updated (if required)

@github-actions github-actions bot added the module: metrics Metrics module label Apr 14, 2022
@vfdev-5 vfdev-5 merged commit 0d40173 into pytorch:master Apr 16, 2022
@vfdev-5 vfdev-5 deleted the metric-rm-decor-warning branch April 16, 2022 08:06