Issues: vllm-project/vllm

[RFC]: Deprecating vLLM V0 (Open, 30)
#18571 opened May 22, 2025 by WoosukKwon

[Roadmap] vLLM Roadmap Q2 2025 (Open, 15)
#15735 opened Mar 29, 2025 by simon-mo
Issues list

[Misc]: why 3B-Instruct-AWQ takes 16G (labels: misc)
#15204 opened Mar 20, 2025 by shaojun

[Misc]: asyncio requests and continuous batching (labels: misc)
#14559 opened Mar 10, 2025 by Bodoral

[Misc]: running multiple vLLM instances on a single ray cluster (labels: misc, ray, stale)
#14277 opened Mar 5, 2025 by gitlawr

[Misc]: Why do we need to explicitly pass tool parsers? (labels: misc, stale)
#13399 opened Feb 17, 2025 by Datta0

[Misc]: How does vllm consume request in async mode? (labels: misc, stale)
#13328 opened Feb 15, 2025 by mru4913

[Misc]: Regarding the issue of inconsistent calculation of tokens (labels: misc, stale)
#13256 opened Feb 14, 2025 by lgy1027

[Feature]: Support Python 3.13 (labels: misc)
#12083 opened Jan 15, 2025 by manueldeprada