ronaldosaheki (Contributor)
Pull Request Description

Currently, when the workload file constant.jsonl has model: null, the client sends model=None even if default_model is set in client.py; the default_model is ignored and the request fails with the following error:

ERROR:root:Request 2: Error (BadRequestError): Error code: 400 - {'object': 'error', 'message': "[{'type': 'string_type', 'loc': ('body', 'model'), 'msg': 'Input should be a valid string', 'input': None}]", 'type': 'BadRequestError', 'param': None, 'code': 400}

With this fix, the client's default_model is used when the workload file has model: null:

{"timestamp": 5000, "requests": [{"prompt": "Certainly, here is the
continuation of the agreement:\n\nTHIS AGREEMENT constitutes the entire
agreement between the parties and supersedes all prior negotiations,
...
Authorized Signature]\n[Insert Authorized Printed Name]\n[Insert
Date]\n\n[Insert Client Name]\n\n---\n\n[Insert Authorized
Signature]\n[Insert Authorized Printed Name]\n[Insert Date]", "model":
null, "prompt_length": 477, "output_length": 7}]}

Running successfully:

```
export API_KEY=${API_KEY}
python3 client.py \
  --workload-path "../generator/output/constant.jsonl" \
  --endpoint "http://localhost:8000" \
  --model deepseek-llm-7b-chat \
  --api-key ${API_KEY} \
  --streaming \
  --output-file-path output.jsonl
```

Output snippet:

```
...
INFO:root:Request 296: Starting streaming request to http://localhost:8000
INFO:root:Request 82: Completed successfully. Tokens: 2527, Latency: 214.43s
INFO:httpx:HTTP Request: POST http://localhost:8000/v1/chat/completions "HTTP/1.1 200 OK"
WARNING:root:Prepare to launch 1 streaming tasks after 0.9986288547515869
INFO:root:Request 297: Starting streaming request to http://localhost:8000
INFO:root:Request 124: Completed successfully. Tokens: 457, Latency: 173.07s
INFO:httpx:HTTP Request: POST http://localhost:8000/v1/chat/completions "HTTP/1.1 200 OK"
WARNING:root:Prepare to launch 1 streaming tasks after 0.9988501071929932
INFO:root:Request 298: Starting streaming request to http://localhost:8000
INFO:httpx:HTTP Request: POST http://localhost:8000/v1/chat/completions "HTTP/1.1 200 OK"
...
```

Additional lint/changes:

  • Syntax fixes from the editor removing trailing whitespace.
  • --api-key should no longer be required, since vLLM can be deployed without an API key (see the sketch after this list).
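
A rough sketch of the optional API key handling (the flag name is taken from the CLI invocation above; the header construction is an assumption, not the exact client.py implementation):

```python
import argparse

parser = argparse.ArgumentParser()
# No required=True: vLLM can be deployed without an API key.
parser.add_argument("--api-key", default=None,
                    help="Optional bearer token for the endpoint.")
args = parser.parse_args()

# Attach the Authorization header only when an API key was supplied.
headers = {"Content-Type": "application/json"}
if args.api_key:
    headers["Authorization"] = f"Bearer {args.api_key}"
```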



Contribution Guidelines

We appreciate your contribution to aibrix! To ensure a smooth review process and maintain high code quality, please adhere to the following guidelines:

Pull Request Title Format

Your PR title should start with one of these prefixes to indicate the nature of the change:

  • [Bug]: Corrections to existing functionality
  • [CI]: Changes to build process or CI pipeline
  • [Docs]: Updates or additions to documentation
  • [API]: Modifications to aibrix's API or interface
  • [CLI]: Changes or additions to the Command Line Interface
  • [Misc]: For changes not covered above (use sparingly)

Note: For changes spanning multiple categories, use multiple prefixes in order of importance.

Submission Checklist

  • PR title includes appropriate prefix(es)
  • Changes are clearly explained in the PR description
  • New and existing tests pass successfully
  • Code adheres to project style and best practices
  • Documentation updated to reflect changes (if applicable)
  • Thorough testing completed, no regressions introduced

By submitting this PR, you confirm that you've read these guidelines and your changes align with the project's contribution standards.

The client default_model should be used when the workload file has `model:
null`

```
{"timestamp": 5000, "requests": [{"prompt": "Certainly, here is the
continuation of the agreement:\n\nTHIS AGREEMENT constitutes the entire
agreement between the parties and supersedes all prior negotiations,
...
Authorized Signature]\n[Insert Authorized Printed Name]\n[Insert
Date]\n\n[Insert Client Name]\n\n---\n\n[Insert Authorized
Signature]\n[Insert Authorized Printed Name]\n[Insert Date]", "model":
null, "prompt_length": 477, "output_length": 7}]}
```

Signed-off-by: Ronaldo Saheki <ronaldo.saheki@workato.com>
@happyandslow self-requested a review March 20, 2025 18:45
@happyandslow (Collaborator) left a comment:

LGTM

@happyandslow merged commit 7ca5ff7 into vllm-project:main Mar 20, 2025
3 checks passed
gangmuk pushed a commit to gangmuk/aibrix-gangmuk that referenced this pull request Jun 21, 2025: [Misc] Fix when workload has model null and client has default_model (vllm-project#887)
Yaegaki1Erika pushed a commit to Yaegaki1Erika/aibrix that referenced this pull request Jul 23, 2025: [Misc] Fix when workload has model null and client has default_model (vllm-project#887)