Making vLLM compatible with Mistral fp8 weights. #10229
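A minimal sketch of the usage this PR aims to enable, assuming it lands as titled: loading a Mistral-format fp8 checkpoint through vLLM's offline LLM API. The checkpoint name is a placeholder, and the exact flag combination is an assumption based on vLLM's existing Mistral-format loading options; the PR itself defines what is actually supported.

```python
# Sketch only: exercising fp8 Mistral-format weights via vLLM's offline API.
# The checkpoint name is a placeholder; quantization="fp8" and the three
# "mistral" format flags are existing vLLM options that this PR presumably
# extends to cover fp8 weights (an assumption based on the PR title).
from vllm import LLM, SamplingParams

llm = LLM(
    model="mistralai/Ministral-8B-Instruct-2410",  # placeholder checkpoint
    quantization="fp8",        # treat the stored weights as fp8
    load_format="mistral",     # consolidated Mistral-format safetensors
    tokenizer_mode="mistral",  # use the tokenizer shipped with the weights
    config_format="mistral",   # read params.json rather than config.json
)

params = SamplingParams(temperature=0.0, max_tokens=64)
outputs = llm.generate(["What does fp8 quantization change?"], params)
print(outputs[0].outputs[0].text)
```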
Changes from all commits: c8865cf, 6937cc7, 3518639, cd1ca2c, f61bc32, 5a027f1, 869923f, 6cb6bad, 40e37e2, cf88baf, 1723734, 133881a, 27595b7, 3ad46bc, c98fa33, dcbd25f, ed1cb8c, 226be71, cdbaf7e, 4e756a7.
Check failure on line 632 in vllm/model_executor/models/llama.py: Ruff E501 (line too long).
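Ruff's E501 rule flags lines that exceed the configured length limit, so the failure above is a style issue rather than a logic error. A hypothetical before/after, not the actual contents of line 632 in llama.py, showing the usual shape of the fix:

```python
# Hypothetical illustration of an E501 fix; not the real line 632, and
# compute_weight_scale is an invented helper used only for illustration.
# Before: a single call that overruns the line-length limit.
weight_scale = compute_weight_scale(layer_weights, quant_config, per_channel=True, dtype="float8_e4m3fn")  # noqa: E501

# After: the same call wrapped across lines so each stays under the limit.
weight_scale = compute_weight_scale(
    layer_weights,
    quant_config,
    per_channel=True,
    dtype="float8_e4m3fn",
)
```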