
SSIM calculation mistake. #2718

Closed
petertheprocess opened this issue Sep 3, 2024 · 4 comments · Fixed by #2721
Labels: bug / fix (Something isn't working)

@petertheprocess (Contributor)

https://github.com/Lightning-AI/torchmetrics/blob/a31417ca1bc4a47ca093c22e3f2f2b5e6843d58a/src/torchmetrics/functional/image/ssim.py#L164C1-L170C72

Before the convolution, we have already applied "same" padding to the image:

        preds = F.pad(preds, (pad_w, pad_w, pad_h, pad_h), mode="reflect")
        target = F.pad(target, (pad_w, pad_w, pad_h, pad_h), mode="reflect")

So ssim_idx_full_image already has the same shape as the original image; there is no need to slice the padding off again, otherwise information at the edges is lost (a shape sketch follows the snippet below):

    ssim_idx_full_image = ((2 * mu_pred_target + c1) * upper) / ((mu_pred_sq + mu_target_sq + c1) * lower)

    if is_3d:
        ssim_idx = ssim_idx_full_image[..., pad_h:-pad_h, pad_w:-pad_w, pad_d:-pad_d]
    else:
        ssim_idx = ssim_idx_full_image[..., pad_h:-pad_h, pad_w:-pad_w]
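To illustrate the shape argument, here is a minimal, self-contained sketch (not the torchmetrics code) assuming a hypothetical 11x11 window: reflect-padding by (kernel_size - 1) // 2 followed by a valid convolution already restores the original spatial size, so slicing the pad off afterwards drops edge pixels.

    import torch
    import torch.nn.functional as F

    img = torch.rand(1, 1, 32, 32)                 # original image
    kernel_size = (11, 11)                         # hypothetical SSIM window
    pad_h = (kernel_size[0] - 1) // 2              # 5
    pad_w = (kernel_size[1] - 1) // 2              # 5

    padded = F.pad(img, (pad_w, pad_w, pad_h, pad_h), mode="reflect")
    kernel = torch.ones(1, 1, *kernel_size) / (kernel_size[0] * kernel_size[1])
    out = F.conv2d(padded, kernel)                 # valid convolution, no extra padding

    print(padded.shape)  # torch.Size([1, 1, 42, 42])
    print(out.shape)     # torch.Size([1, 1, 32, 32]) -- already matches img
    # Slicing out[..., pad_h:-pad_h, pad_w:-pad_w] would shrink it to 22x22,
    # discarding the per-pixel SSIM values at the image border.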

github-actions bot commented Sep 3, 2024

Hi! Thanks for your contribution, great first issue!

@petertheprocess (Contributor, Author) commented Sep 3, 2024

Also, kernel_size is an input parameter, while gauss_kernel_size is calculated from the input parameter sigma. If we call the function without a Gaussian kernel, the padding size is still calculated from gauss_kernel_size rather than kernel_size (a small numeric sketch follows the snippet below).

    gauss_kernel_size = [int(3.5 * s + 0.5) * 2 + 1 for s in sigma]

    pad_h = (gauss_kernel_size[0] - 1) // 2
    pad_w = (gauss_kernel_size[1] - 1) // 2
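For illustration, a minimal sketch of the mismatch with hypothetical values (sigma = 1.5 and a user-supplied 7x7 window); gaussian_kernel=False below is assumed to be torchmetrics' flag for switching to a uniform kernel.

    # Hypothetical values, not taken from the torchmetrics source.
    sigma = (1.5, 1.5)
    kernel_size = (7, 7)                            # user-supplied window

    # Padding as currently computed, always derived from sigma:
    gauss_kernel_size = [int(3.5 * s + 0.5) * 2 + 1 for s in sigma]  # [11, 11]
    pad_h = (gauss_kernel_size[0] - 1) // 2         # 5
    pad_w = (gauss_kernel_size[1] - 1) // 2         # 5

    # Padding one would expect when gaussian_kernel=False, i.e. based on kernel_size:
    expected_pad_h = (kernel_size[0] - 1) // 2      # 3
    expected_pad_w = (kernel_size[1] - 1) // 2      # 3

    print(pad_h, expected_pad_h)                    # 5 3 -> padding does not match the kernel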

@petertheprocess (Contributor, Author)

Hello, if these behaviors are not deliberate design choices, I would be happy to open a PR to fix them. I look forward to your feedback. Thank you!

@SkafteNicki (Member)

Hi @petertheprocess, thanks for raising this issue.
It does seem like both are errors. Feel free to create a PR correcting this :)
