
Passing max_detection_thresholds to MeanAveragePrecision breaks the calculation #1795

Closed
aamster opened this issue May 21, 2023 · 3 comments
Labels: bug / fix (Something isn't working), help wanted (Extra attention is needed)

Comments

@aamster

aamster commented May 21, 2023

🐛 Bug

Passing a max_detection_thresholds value other than 100 to MeanAveragePrecision causes .compute() to return -1 for map.

To Reproduce

Take the example at https://github.com/Lightning-AI/torchmetrics/blob/master/examples/detection_map.py but pass max_detection_thresholds=[50] to MeanAveragePrecision.
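A minimal sketch of such a reproduction, assuming the standard torchmetrics detection API; the boxes, scores, and labels below are illustrative and not taken from the linked example:

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# Single prediction/target pair in the format MeanAveragePrecision expects
preds = [
    {
        "boxes": torch.tensor([[258.0, 41.0, 606.0, 285.0]]),
        "scores": torch.tensor([0.536]),
        "labels": torch.tensor([0]),
    }
]
target = [
    {
        "boxes": torch.tensor([[214.0, 41.0, 562.0, 285.0]]),
        "labels": torch.tensor([0]),
    }
]

# Non-default max detection threshold triggers the reported behaviour
metric = MeanAveragePrecision(max_detection_thresholds=[50])
metric.update(preds, target)
result = metric.compute()
print(result["map"])  # reported bug: tensor(-1.) instead of a valid mAP value
```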

Expected behavior

It should not return -1; map should be computed at the requested detection threshold.

@aamster added the bug / fix and help wanted labels on May 21, 2023
@bchretien

Duplicate of #1153 it seems.

@SkafteNicki
Member

Hi @aamster,
Could you try running from master?

pip install https://github.com/Lightning-AI/torchmetrics/archive/master.zip

To me it seems the issue was fixed by PR #1712; running the example as you propose seems to produce the right answer.

@aamster
Author

aamster commented May 22, 2023

@SkafteNicki it is fixed on master. I will close this issue then

@aamster closed this as completed on May 22, 2023