Raise an error if sync_dist is set when logging torchmetrics #15588

Open
carmocca opened this issue Nov 8, 2022 · 3 comments
@carmocca (Contributor) commented Nov 8, 2022

Proposal

Re-do #13903


Context

    > When using torchmetrics, it will sync whether you want it or not.

True, sorry, I noticed this later. We can make it `None` by default.

One can choose whether it additionally happens on step or not.

We can determine that based on `on_step` and `on_epoch` if `sync_dist` is not `None`:

  • If just `on_step=True` or just `on_epoch=True`, set the corresponding flag inside torchmetrics accordingly.
  • If `on_step=True` and `on_epoch=True`, either:
    • Set both flags.
    • Set only `sync_on_compute`, to align with how we handle scalar metrics.
    • Raise an error asking the user to set it themselves, since we can't determine whether they want to sync on step, on epoch, or both.

Perhaps some of these checks are already done internally, but I wouldn't know where to look. A rough sketch of such a check follows below.
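
As an illustration only, here is a minimal sketch of how the check could look if the "raise an error" option were chosen for the ambiguous case. The helper name `_check_sync_dist_for_metric` and its call site are hypothetical, and the sketch assumes the proposed `sync_dist=None` default together with the torchmetrics `sync_on_compute` / `dist_sync_on_step` attributes:

```python
from typing import Optional

from torchmetrics import Metric
from pytorch_lightning.utilities.exceptions import MisconfigurationException


def _check_sync_dist_for_metric(
    value,
    sync_dist: Optional[bool],  # proposed: default None instead of False
    on_step: bool,
    on_epoch: bool,
) -> None:
    """Hypothetical helper, not the actual Lightning implementation."""
    if not isinstance(value, Metric) or sync_dist is None:
        # Plain tensors keep the existing `sync_dist` behaviour; for a Metric,
        # `None` means "defer to the metric's own sync settings".
        return
    if on_step and on_epoch:
        # Ambiguous: we can't tell whether the user wants to sync on step,
        # on epoch, or both, so ask them to configure the metric directly.
        raise MisconfigurationException(
            "`sync_dist` is not supported when logging a `torchmetrics.Metric`"
            " with `on_step=True` and `on_epoch=True`. Configure `sync_on_compute`"
            " and/or `dist_sync_on_step` on the metric instead."
        )
    if on_epoch:
        # Sync the accumulated states when `compute()` runs at epoch end.
        value.sync_on_compute = sync_dist
    else:
        # Sync the metric states on every step/`update()` call.
        value.dist_sync_on_step = sync_dist
```

Which of the three listed options to take for the `on_step=True` and `on_epoch=True` case is exactly what this issue is meant to decide; the sketch just picks one of them.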

If you are talking about:

  • In case torchmetrics are logged, what happens when `sync_dist=True`? It is ignored.
  • What happens at step level if `sync_dist=True`? Nothing (see the example below).
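
For context, a hedged usage sketch of the situation being discussed; the module, layer sizes, and metric choice are illustrative, not taken from the issue:

```python
import torch
from torch import nn
import pytorch_lightning as pl
import torchmetrics


class LitClassifier(pl.LightningModule):
    """Illustrative example only."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.layer = nn.Linear(32, num_classes)
        self.acc = torchmetrics.Accuracy(task="multiclass", num_classes=num_classes)

    def training_step(self, batch, batch_idx):
        x, y = batch
        logits = self.layer(x)
        loss = nn.functional.cross_entropy(logits, y)
        self.acc(logits, y)
        # For the Metric object, `sync_dist=True` is currently silently ignored:
        # cross-process reduction is governed by the metric's own
        # `sync_on_compute` / `dist_sync_on_step` flags, not by this argument.
        self.log("train_acc", self.acc, on_step=True, on_epoch=True, sync_dist=True)
        # For a plain tensor, `sync_dist=True` does trigger the reduction.
        self.log("train_loss", loss, sync_dist=True)
        return loss

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)
```

On a single process the flag makes no difference either way; the discrepancy only surfaces under distributed strategies such as DDP, which is why a silent no-op is easy to miss and an explicit error would help.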

Originally posted by @rohitgr7 in #13903 (comment)

cc @Borda @carmocca @Blaizzy

@carmocca added the `feature` (Is an improvement or enhancement), `logging` (Related to the `LoggerConnector` and `log()`), and `pl` (Generic label for PyTorch Lightning package) labels on Nov 8, 2022
@carmocca added this issue to the future milestone on Nov 8, 2022
@shenoynikhil (Contributor) commented:

I'll take this up.

@justusschock (Member) commented:

@shenoynikhil how is it going here?

@shenoynikhil (Contributor) commented:

@justusschock Will attend to it this weekend.
