
Conversation

patrickvonplaten
Contributor

PR #8518 fixed a bug by removing an unnecessary weight from the T5 cross-attention layer.
Afterwards, however, the corresponding regex was added to the wrong ignore list. This weight can never be reported as "missing", since it no longer exists in the model; it can only be reported as "not used", because it is still present in saved checkpoints. This PR fixes the incorrect warning by moving the regex to the correct list.
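Below is a minimal, self-contained sketch of why the list placement matters. It is not the library's actual loading code; the exact regex pattern and list contents are illustrative assumptions based on the description above.

```python
import re

# Keys present in an old checkpoint but absent from the current model.
unexpected_keys = [
    "decoder.block.0.layer.1.EncDecAttention.relative_attention_bias.weight",
]
# Keys expected by the model but absent from the checkpoint.
missing_keys = []

# The weight was removed from the model, so it can only ever show up as
# "unexpected"; silencing the warning therefore requires the *unexpected*
# ignore list, not the *missing* one.
keys_to_ignore_on_load_unexpected = [
    r"decoder\.block\.0\.layer\.1\.EncDecAttention\.relative_attention_bias\.weight",
]
keys_to_ignore_on_load_missing = []  # placing the regex here has no effect

def filter_keys(keys, ignore_patterns):
    """Drop keys matching any ignore pattern before warning about them."""
    return [k for k in keys if not any(re.search(p, k) for p in ignore_patterns)]

if filter_keys(unexpected_keys, keys_to_ignore_on_load_unexpected):
    print("Warning: some checkpoint weights were not used")
if filter_keys(missing_keys, keys_to_ignore_on_load_missing):
    print("Warning: some model weights were not initialized from the checkpoint")
```

With the regex in the "unexpected" list, the spurious "not used" warning for the removed cross-attention weight is suppressed when loading older checkpoints.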
