Prefix clash with MetricCollection #2065

Closed
abcamiletto opened this issue Sep 8, 2023 · 1 comment · Fixed by #2070
Labels: bug / fix · help wanted · v1.1.x


abcamiletto commented Sep 8, 2023

🐛 Bug

There is a bug with metrics that return dictionaries when they are used inside a MetricCollection.

To Reproduce

Steps to reproduce the behavior...

import torch
from typing import Dict
from torchmetrics import Metric, MetricCollection

class CustomAccuracy(Metric):
    def __init__(self):
        super().__init__()
        self.prefix = 'accuracy'

    def update(self, preds: torch.Tensor, target: torch.Tensor) -> None:
        self.correct = torch.sum(preds == target)
        self.total = preds.numel()

    def compute(self) -> Dict[str, torch.Tensor]:
        # Returns a dict keyed by this metric's own prefix attribute
        res = self.correct.float() / self.total
        return {f"{self.prefix}/value": res}

class CustomPrecision(Metric):
    def __init__(self):
        super().__init__()
        self.prefix = 'precision'

    def update(self, preds: torch.Tensor, target: torch.Tensor) -> None:
        self.true_positives = torch.sum((preds == target) & (preds == 1))
        self.predicted_positives = torch.sum(preds == 1)

    def compute(self) -> Dict[str, torch.Tensor]:
        res = self.true_positives.float() / self.predicted_positives
        return {f"{self.prefix}/value": res}

# Initialize MetricCollection with Accuracy and Precision
metrics = MetricCollection([CustomAccuracy(), CustomPrecision()])

# Mock predictions and targets
preds = torch.tensor([1, 0, 0, 1])
targets = torch.tensor([1, 0, 0, 0])

# Update metrics with current batch
metrics(preds, targets)

# Print the calculated metrics
print(metrics.compute())
# Returns {'precisionaccuracy/value': tensor(0.7500), 'precisionprecision/value': tensor(0.5000)}
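Note that 'precision' is prepended to both keys, which suggests that the per-metric attribute named prefix is being picked up by MetricCollection's own key-prefixing logic (an inference from the output above, not a confirmed reading of the internals). Under that assumption, a minimal workaround sketch is to rename the attribute; key_prefix is an arbitrary name invented here, not a torchmetrics API:

from typing import Dict

import torch
from torchmetrics import Metric

class PatchedAccuracy(Metric):
    def __init__(self):
        super().__init__()
        # Hypothetical attribute name, chosen only so it cannot collide with
        # the `prefix` name that MetricCollection appears to reserve for itself
        self.key_prefix = 'accuracy'

    def update(self, preds: torch.Tensor, target: torch.Tensor) -> None:
        self.correct = torch.sum(preds == target)
        self.total = preds.numel()

    def compute(self) -> Dict[str, torch.Tensor]:
        return {f"{self.key_prefix}/value": self.correct.float() / self.total}

The same rename in CustomPrecision restores the keys expected below.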

Expected behavior

I expect the results to be

{'accuracy/value': tensor(0.7500), 'precision/value': tensor(0.5000)}

and not

{'precisionaccuracy/value': tensor(0.7500), 'precisionprecision/value': tensor(0.5000)}
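For comparison, prefixing at the collection level is already exposed through MetricCollection's own prefix argument, which prepends a fixed string to every returned key. A minimal sketch of that supported route (assuming the dict keys behave as expected once the bug is fixed):

# Collection-level prefixing: every key in the computed dict should come
# back with "val_" prepended, e.g. "val_accuracy/value".
metrics = MetricCollection([CustomAccuracy(), CustomPrecision()], prefix="val_")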

Environment

  • TorchMetrics version (and how you installed TM, e.g. conda, pip, build from source): 1.1.1, via pip
  • Python & PyTorch Version: 3.10 and 1.13
  • Any other relevant information such as OS (e.g., Linux):

Additional context

Please note that this works correctly in torchmetrics 0.11.4.
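Until a fix lands, pinning the dependency to torchmetrics==0.11.4 (the version confirmed above to behave correctly) is one possible stopgap.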

abcamiletto added the bug / fix and help wanted labels on Sep 8, 2023
github-actions bot commented Sep 8, 2023

Hi! Thanks for your contribution, great first issue!

Borda changed the title from "Prefix clash with MetricCollection with 1.1.1" to "Prefix clash with MetricCollection" on Sep 8, 2023
Borda added the v1.1.x label on Sep 8, 2023