some reference-freeing I missed from @neonsecret's optimized attention
…CompVis#177

plus some extras from me (del v, del h)
Birch-san committed Sep 11, 2022
1 parent 91d29c2 commit 18bb5f8
Showing 1 changed file (ldm/modules/attention.py) with 3 additions and 0 deletions.
@@ -191,9 +191,12 @@ def forward(self, x, context=None, mask=None):

  # attention, what we cannot get enough of
  attn = sim.softmax(dim=-1)
+ del sim

  out = einsum('b i j, b j d -> b i d', attn, v)
+ del attn, v
  out = rearrange(out, '(b h) n d -> b n (h d)', h=h)
+ del h
  return self.to_out(out)
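The added lines all follow one pattern: `del` each large intermediate (the similarity matrix, the attention weights, the value tensor) as soon as the last operation that reads it has run, so that fewer big tensors are alive at peak. A minimal sketch of the same pattern, written here in NumPy rather than the repository's PyTorch/einops code, with hypothetical shapes and a hand-rolled softmax:

```python
import numpy as np

def attention(q, k, v):
    # q, k, v: (batch*heads, n, d) -- illustrative shapes, not the CompVis ones
    sim = q @ k.transpose(0, 2, 1) * (q.shape[-1] ** -0.5)
    sim -= sim.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    attn = np.exp(sim)
    attn /= attn.sum(axis=-1, keepdims=True)
    del sim   # similarity scores are consumed by the softmax; drop them now
    out = attn @ v
    del attn, v  # neither is read again after this matmul
    return out

q = np.random.rand(2, 4, 8)
k = np.random.rand(2, 4, 8)
v = np.random.rand(2, 4, 8)
out = attention(q, k, v)
print(out.shape)  # (2, 4, 8)
```

Note that `del` only removes the local name; the array is freed when its reference count hits zero, so this helps exactly when the deleted name was the last reference. That is also why `del h` in the diff is a much smaller win than `del sim`: `h` is the head count, not a tensor.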


