This repository was archived by the owner on Feb 7, 2025. It is now read-only.

Unused projection layer in AttentionBlock #503

Open
@ChristianHinge

Description

After wrapping a MONAI AutoencoderKL model in PyTorch Lightning, I got an error about an unused parameter in the custom AttentionBlock implementation.
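As a stopgap, the error can be silenced by telling DDP to tolerate parameters that never receive a gradient. A minimal sketch, assuming PyTorch Lightning's DDPStrategy (the trainer arguments and fit call are illustrative placeholders, not my exact setup):

```python
# Workaround sketch, not a fix: let DDP skip parameters that receive no
# gradient during backward, at a small performance cost.
import pytorch_lightning as pl
from pytorch_lightning.strategies import DDPStrategy

trainer = pl.Trainer(
    strategy=DDPStrategy(find_unused_parameters=True),
    accelerator="gpu",
    devices=2,
)
# trainer.fit(lit_autoencoder, datamodule=dm)  # placeholders
```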

In diffusion_model_unet.py and autoencoderkl.py (line 233):

```python
self.proj_attn = nn.Linear(num_channels, num_channels)
```

To me, it looks like this layer is completely unused: it is defined but never called in the forward pass. I cannot tell whether it is a mistake that the parameter is unused, or whether it should never have been defined in the first place.
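Here is a self-contained sketch of the pattern (illustrative names, not the actual MONAI code): a layer registered in `__init__` but never invoked in `forward` ends up with `grad is None` after backward, which is exactly what DDP flags as an unused parameter:

```python
import torch
import torch.nn as nn

class Block(nn.Module):
    """Mimics the AttentionBlock pattern: proj_attn is registered but never called."""

    def __init__(self, num_channels: int):
        super().__init__()
        self.to_out = nn.Linear(num_channels, num_channels)
        self.proj_attn = nn.Linear(num_channels, num_channels)  # defined but never used

    def forward(self, x):
        return self.to_out(x)  # proj_attn is never invoked here

block = Block(8)
block(torch.randn(2, 8)).sum().backward()
print([n for n, p in block.named_parameters() if p.grad is None])
# ['proj_attn.weight', 'proj_attn.bias'] -> the parameters DDP complains about
```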

Metadata

Labels: bug (Something isn't working)