Conversation

pooyadavoodi (Contributor)
The run_batch API currently creates error responses for requests that fail inside the engine; these responses appear in the output file, indicating which request ID failed. This PR ensures that error responses are also created for requests that fail before reaching the engine, due to invalid URL endpoints or unsupported features such as streaming.
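To illustrate the idea, here is a minimal sketch of how a batch runner might short-circuit such requests into error-response lines before they reach the engine. This is not vLLM's actual implementation: the supported-endpoint set, the function names, and the exact error format are assumptions, loosely modeled on the JSONL batch input/output convention (one request object per line in, one result object per line out).

```python
import json

# Hypothetical endpoint whitelist -- an assumption for illustration only.
SUPPORTED_ENDPOINTS = {"/v1/chat/completions", "/v1/embeddings"}

def make_error_response(custom_id, message):
    # One output-file line: the error is recorded in place of a response
    # body, so the failing request ID is still visible in the output.
    return {"custom_id": custom_id, "response": None,
            "error": {"message": message}}

def process_line(line):
    """Return an error-response dict for invalid/unsupported requests,
    or None if the request would be forwarded to the engine."""
    req = json.loads(line)
    if req.get("url") not in SUPPORTED_ENDPOINTS:
        return make_error_response(req.get("custom_id"),
                                   f"Unsupported endpoint: {req.get('url')}")
    if req.get("body", {}).get("stream"):
        return make_error_response(req.get("custom_id"),
                                   "Streaming is not supported in run_batch")
    return None

bad = '{"custom_id": "req-1", "url": "/v1/bad", "body": {}}'
print(process_line(bad)["error"]["message"])  # -> Unsupported endpoint: /v1/bad
```

Before this PR, requests of this kind failed without producing an output entry; after it, they yield an error line in the output file just like in-engine failures do.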


👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small and essential subset of CI tests to quickly catch errors. You can run other CI tests on top of these by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can do one of these:

  • Add ready label to the PR
  • Enable auto-merge.

🚀

@DarkLight1337 (Member) left a comment


Thanks for the QoL improvement!

@DarkLight1337 enabled auto-merge (squash) September 11, 2024 02:50
@github-actions bot added the ready label (ONLY add when PR is ready to merge/full CI is needed) Sep 11, 2024
@DarkLight1337 merged commit cea95df into vllm-project:main Sep 11, 2024
69 checks passed
dtrifiro pushed a commit to opendatahub-io/vllm that referenced this pull request Sep 12, 2024
Alvant pushed a commit to compressa-ai/vllm that referenced this pull request Oct 26, 2024
garg-amit pushed a commit to garg-amit/vllm that referenced this pull request Oct 28, 2024
…batch (vllm-project#8347)

Signed-off-by: Amit Garg <mitgarg17495@gmail.com>
LeiWang1999 pushed a commit to LeiWang1999/vllm-bitblas that referenced this pull request Mar 26, 2025
…batch (vllm-project#8347)

Signed-off-by: LeiWang1999 <leiwang1999@outlook.com>
Labels: ready (ONLY add when PR is ready to merge/full CI is needed)
2 participants