Description
Your current environment
The output of `python collect_env.py`
How would you like to use vllm
I want to run inference with a Mistral 8x7B model. I want to use function calling / tools while serving it through the OpenAI-compatible API endpoints, but I cannot find any details about this. Is it supported? If not, what is the timeline for adding it?
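For context, recent vLLM versions expose tool/function calling through the OpenAI-compatible server (launched with flags along the lines of `--enable-auto-tool-choice --tool-call-parser mistral`). Below is a minimal sketch of the request a client would send to `/v1/chat/completions`; the base URL, model name, and `get_weather` tool are placeholders for illustration, and the payload shape follows the OpenAI chat-completions tools format rather than anything vLLM-specific.

```python
import json

# Hypothetical server address and model; assumes vLLM was started with
# something like:
#   vllm serve mistralai/Mixtral-8x7B-Instruct-v0.1 \
#       --enable-auto-tool-choice --tool-call-parser mistral
BASE_URL = "http://localhost:8000/v1"
MODEL = "mistralai/Mixtral-8x7B-Instruct-v0.1"

# OpenAI-style tool (function) definition the model may choose to call.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

# Request body for POST {BASE_URL}/chat/completions — the same shape the
# official `openai` Python client would send for a tool-calling request.
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
    "tool_choice": "auto",
}

body = json.dumps(payload)
```

If the server and model support tool calling, the response's `choices[0].message` carries a `tool_calls` list whose arguments the client executes before sending the result back as a `role: "tool"` message.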