Commit

[Misc] Remove vllm-project#4789 workaround left in vllm/entrypoints/openai/run_batch.py (vllm-project#5756)

zifeitong authored and jimpang committed Jul 8, 2024
1 parent 5216b0d commit efe5201
Showing 1 changed file with 0 additions and 4 deletions.

vllm/entrypoints/openai/run_batch.py

@@ -1,5 +1,4 @@
 import asyncio
-import sys
 from io import StringIO
 from typing import Awaitable, List

@@ -137,9 +136,6 @@ async def main(args):
     output_buffer.seek(0)
     await write_file(args.output_file, output_buffer.read().strip())

-    # Temporary workaround for https://github.com/vllm-project/vllm/issues/4789
-    sys.exit(0)
-

 if __name__ == "__main__":
     args = parse_args()
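For context, the removed workaround amounted to forcing process termination at the end of the async entrypoint. Below is a minimal sketch of that pattern, simplified from the diff above; the `main` body here is a hypothetical stand-in for the real batch-processing code, and the exact hang being worked around is the one tracked in vllm-project/vllm#4789.

```python
import asyncio
import sys


async def main() -> None:
    # ... batch-processing work would happen here (hypothetical stand-in) ...
    # Workaround pattern removed by this commit: raise SystemExit(0) via
    # sys.exit() so the process terminates with a zero exit status even if
    # something would otherwise keep it alive after main() returns.
    sys.exit(0)
```

`sys.exit(0)` raises `SystemExit`, which `asyncio.run()` propagates out of the event loop; the commit removes it because the underlying hang no longer needs this stopgap.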
