[Tests] Fix failing 8bit test #25564
Conversation
The documentation is not available anymore as the PR was closed or merged.
Hi @younesbelkada! Could you tell a bit more about why MPT needs einops?
Also, you probably need to refresh your CircleCI token?
For MPT, as you can see from the remote model, einops is used in files that are imported in the modeling code.
OK, I think I got confused by the fact that we have a modeling file and we also use the same checkpoint name in our model tests. Now I get it, but a question: why do we want to use the remote code rather than the code in transformers?
Thanks!
Thanks for explaining. Let's try it!
I am not sure what the original purpose of testing the modeling code on the Hub is: although we pin a revision, the code on the Hub might change, and our tests won't detect any failures in the new code.
But as it is already there, let's keep it. Let's just not add any more remote models to the relevant tests 🙏
Quite aligned with @ydshieh here. I don't think we should be testing code on the Hub with trust_remote_code. Why not use the model now that it is in transformers?
EDIT: read the original PR, I guess it's okay, though I'm not a big fan of testing remote code 😅
- fix failing 8bit test
- trigger CI
What does this PR do?
Fixes two failing tests in https://github.com/huggingface/transformers/actions/runs/5873964880/job/15928072870
- tests/quantization/bnb/test_mixed_int8.py::MixedInt8Test::test_get_keys_to_not_convert
- tests/quantization/bnb/test_mixed_int8.py::MixedInt8GPT2Test::test_get_keys_to_not_convert
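For context, these tests exercise the helper that decides which modules are kept in full precision during 8-bit quantization. A minimal, hypothetical sketch of that idea (this is not the actual transformers implementation; the function name and pattern list are illustrative):

```python
def get_keys_to_not_convert_sketch(module_names, skip_patterns=("lm_head",)):
    """Return module names to keep in full precision during int8 quantization.

    Illustrative stand-in only: the real helper also handles tied weights
    and output embeddings, but the core idea is to skip sensitive modules
    such as the output head rather than converting them to int8.
    """
    return [
        name
        for name in module_names
        if any(pattern in name for pattern in skip_patterns)
    ]


modules = ["transformer.h.0.attn", "transformer.wte", "lm_head"]
print(get_keys_to_not_convert_sketch(modules))  # -> ['lm_head']
```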
Context: #25105 added stronger checks to ensure models on the Hub are quantized correctly, including a test that checks that
mpt-7b
is correctly quantized. Since that model requires einops as a dependency, I propose to simply add einops to the docker image.

cc @ydshieh
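Since the remote MPT code imports einops at load time, the test will fail unless the CI image has it installed. A stdlib-only sketch of the kind of optional-dependency check libraries typically use (the function name here is hypothetical, not a transformers API):

```python
import importlib.util


def is_package_available(name: str) -> bool:
    # True if the optional dependency can be imported in this environment,
    # without actually importing it.
    return importlib.util.find_spec(name) is not None


# In the CI docker image, `pip install einops` would make this True:
print(is_package_available("einops"))
```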