Description
🐛 Bug
MeanIoU scored 56(!) over validation dataset
Let's look at the source code:
```python
def update(self, preds: Tensor, target: Tensor) -> None:
    """Update the state with the new data."""
    intersection, union = _mean_iou_update(preds, target, self.num_classes, self.include_background)
    score = _mean_iou_compute(intersection, union, per_class=self.per_class)
    self.score += score.mean(0) if self.per_class else score.mean()

def compute(self) -> Tensor:
    """Update the state with the new data."""
    return self.score  # / self.num_batches
```
There are several issues there:

- `self.score` is accumulated with each call of the `update` method
- the `compute` method just returns the accumulated score
- the docstring of the `compute` method is copy-pasted from the `update` method
- the division by `self.num_batches` is commented out
- `num_batches` is defined at class level but not used anywhere else
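A minimal sketch of how the accumulation could be made correct (this is not the torchmetrics implementation or its official patch; the `FixedMeanIoU` class and its naive per-class IoU loop are illustrative only): keep counting `num_batches` in `update` and divide by it once in `compute`, so repeated updates average instead of summing.

```python
import torch
from torch import Tensor

class FixedMeanIoU:
    """Illustrative sketch only, not the torchmetrics API:
    average per-batch scores in compute() instead of returning the raw sum."""

    def __init__(self, num_classes: int) -> None:
        self.num_classes = num_classes
        self.score = torch.tensor(0.0)
        self.num_batches = 0

    def update(self, preds: Tensor, target: Tensor) -> None:
        # Naive per-class IoU over integer label maps.
        ious = []
        for c in range(self.num_classes):
            pred_c = preds == c
            target_c = target == c
            union = (pred_c | target_c).sum()
            if union > 0:  # skip classes absent from both tensors
                intersection = (pred_c & target_c).sum()
                ious.append(intersection.float() / union.float())
        self.score += torch.stack(ious).mean()
        self.num_batches += 1  # actually used, unlike in the released code

    def compute(self) -> Tensor:
        # Divide by the number of update() calls, so the result
        # stays in [0, 1] regardless of how many batches were seen.
        return self.score / self.num_batches
```

With this shape, a perfect prediction stays at 1.0 no matter how many validation batches are processed.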
Obviously, that code was neither reviewed nor tested, but somehow was released.
To Reproduce
1. Call `metric.update(y_hat, y)` in `validation_step`.
2. Log the metric in `on_validation_epoch_end`.
Expected behavior
MeanIoU computes a correct value in the [0, 1] range.
Environment
- TorchMetrics 1.4.0.post0 from pip
- Python 3.11.9, PyTorch 2.3.0+cu121
- pytorch-lightning 2.2.4
- Windows 11