This repository was archived by the owner on Feb 7, 2025. It is now read-only.
Unused projection layer in AttentionBlock #503
Open
Description
After wrapping a MONAI AutoencoderKL model in PyTorch Lightning, I got an error about an unused parameter in the custom AttentionBlock implementation.
In both diffusion_model_unet.py and autoencoderkl.py, at line 233:
self.proj_attn = nn.Linear(num_channels, num_channels)
To me, it looks like this layer is completely unused. I cannot tell whether it is a mistake that this parameter is unused, or whether it should never have been defined in the first place.
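For context, here is a minimal sketch of the pattern (layer names other than proj_attn are simplified assumptions, not MONAI's actual code), showing how a layer that is defined in __init__ but never called in forward() triggers DDP's unused-parameter error:

import torch
import torch.nn as nn

class TinyAttentionBlock(nn.Module):
    def __init__(self, num_channels: int) -> None:
        super().__init__()
        self.to_q = nn.Linear(num_channels, num_channels)
        self.to_k = nn.Linear(num_channels, num_channels)
        self.to_v = nn.Linear(num_channels, num_channels)
        self.to_out = nn.Linear(num_channels, num_channels)
        # Defined but never used in forward(), mirroring proj_attn above.
        # Under DDP with find_unused_parameters=False, this parameter never
        # receives a gradient, which raises the unused-parameter error.
        self.proj_attn = nn.Linear(num_channels, num_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, sequence, num_channels)
        q, k, v = self.to_q(x), self.to_k(x), self.to_v(x)
        attn = torch.softmax(q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5, dim=-1)
        return self.to_out(attn @ v)  # self.proj_attn is never called

As a workaround until this is resolved, DDP can be told to tolerate unused parameters (at some synchronization cost), e.g. in Lightning via Trainer(strategy=DDPStrategy(find_unused_parameters=True)); removing the layer, or actually applying it in forward(), would be the proper fix.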