Conversation

@xgal (Contributor) commented Aug 21, 2024

What does this PR do?

Fixes # (issue)
Can't use Jamba with FSDP because the `A` parameter is initialized as FP32 regardless of the weights and `dtype` passed. That behavior isn't needed: the weights are BF16, and the forward pass casts the dtype where needed anyway, as in `modeling_mamba2`.
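
For context, a minimal, hypothetical sketch of the dtype pattern (names loosely follow the Jamba mixer; this is not the actual diff):

```python
import torch
import torch.nn as nn


class MixerDtypeSketch(nn.Module):
    """Illustrative only: mirrors the A_log init pattern, not the actual Jamba code."""

    def __init__(self, intermediate_size: int, ssm_state_size: int):
        super().__init__()
        # Before the fix: A was built with dtype=torch.float32, so A_log stayed
        # FP32 even when the rest of the model was BF16. FSDP's flat-parameter
        # sharding expects a uniform dtype across parameters, so the stray FP32
        # parameter broke FSDP.
        A = torch.arange(1, ssm_state_size + 1).float()[None, :]
        A = A.expand(intermediate_size, -1).contiguous()
        # After the fix: follow the ambient default dtype (e.g. BF16 when the
        # model is loaded with torch_dtype=torch.bfloat16) instead of pinning FP32.
        self.A_log = nn.Parameter(torch.log(A).to(torch.get_default_dtype()))

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Upcast where precision matters, as modeling_mamba2 does in its forward
        # pass, so keeping A_log in BF16 at rest loses nothing.
        A = -torch.exp(self.A_log.float())
        # ... the real mixer feeds A into the selective scan; elided here ...
        return hidden_states
```

With `A_log` stored in the same dtype as every other parameter, FSDP can flatten and shard the model uniformly, while the forward-pass upcast preserves the precision of the scan.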

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@ArthurZucker (Collaborator) left a comment

Good catch!

@ArthurZucker merged commit 6baa6f2 into huggingface:main Aug 22, 2024
ArthurZucker pushed a commit that referenced this pull request Aug 22, 2024
Co-authored-by: Gal Cohen <galc@ai21.com>
BernardZach pushed a commit to BernardZach/transformers that referenced this pull request Dec 5, 2024
Co-authored-by: Gal Cohen <galc@ai21.com>
