
Errors with adapter #62

Open
351246241 opened this issue Nov 24, 2023 · 0 comments

Comments


351246241 commented Nov 24, 2023

Thank you for your awesome work.
But when I run test.py, I get this error:
RuntimeError: Sizes of tensors must match except in dimension 1. Expected size 1024 but got size 768 for tensor number 1 in the list.
It seems the output shapes of the CLIP encoder and the VAE encoder do not match.
How can I fix it?

"
def forward(self, clip, vae):
# clip (1 257 1024)
vae = self.pool(vae) # 1 4 80 64 --> 1 4 40 32
vae = rearrange(vae, 'b c h w -> b c (h w)') # 1 4 40 32 --> 1 4 1280
vae = self.vae2clip(vae) # 1 4 768
# Concatenate them is difficult
concat = torch.cat((clip, vae), 1)
"
