Description
Describe the issue
I am able to host and load the local vLLM endpoint. The issue is with adding additional data payload params to the /chat/completions endpoint; I need to send a few additional params.
I used the AutoGen Studio UI to set the API endpoint, API key, etc., but there is no option to add additional data payload fields.
Steps to reproduce
For example, I need to send a data payload like the one below to the vLLM endpoint. How can this be configured in the AutoGen Studio model endpoint settings?
===Example
import requests

# API endpoint
url = "http://localhost:8001/v1/chat/completions"

# Headers for the request
headers = {"Content-Type": "application/json"}

# Data payload
data = {
    "messages": [{"role": "user", "content": "list of songs in 1900"}],
    "use_context": True,       # Set to True if needed
    "context_filter": None,    # Provide a context filter if required
    "include_sources": False,  # Set to True to include sources
    "stream": False            # Set to True for streaming
}

# Making the POST request
response = requests.post(url, headers=headers, json=data)
==========end of example======
Please suggest how this can be done in AutoGen Studio; I am happy to add custom code to support it. A rough sketch of what I have in mind is below.
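For context, here is a minimal sketch of what would cover my use case, assuming the underlying client is the openai Python SDK (which works against vLLM's OpenAI-compatible server) and using a placeholder model name; the SDK's extra_body parameter is the usual way to pass such non-standard fields:
===Sketch
from openai import OpenAI

# Point the OpenAI-compatible client at the local vLLM server
client = OpenAI(base_url="http://localhost:8001/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="my-local-model",  # placeholder; adjust to the model served by vLLM
    messages=[{"role": "user", "content": "list of songs in 1900"}],
    # extra_body forwards non-standard fields in the request payload
    extra_body={
        "use_context": True,
        "context_filter": None,
        "include_sources": False,
    },
    stream=False,
)
print(response.choices[0].message.content)
==========end of sketch======
If AutoGen Studio's model endpoint configuration could accept an extra_body-style dict like this, that would solve my problem.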
Screenshots and logs
No response
Additional Information
No response