Description
Hi team,
It seems that vLLM 0.6.2 has some issues when used with models such as Qwen2-VL, Pixtral, and others (#9068, #9091, ...).
Most of these bugs have already been fixed on the current dev branch, but they haven't been included in a new release yet, which makes the library tricky to use in a production environment.
I'm really grateful for all the hard work you're putting into this project! It would be great to see more frequent releases, especially patch releases that ship bug fixes.
Thanks a lot!
Michael