Confirm this is a feature request for the .NET library and not the underlying OpenAI API
This is a feature request for the .NET library
Describe the feature or improvement you are requesting
I am trying to implement function calling using the OpenAI Batch API as mentioned in the documentation here. The documentation states that function calling is supported in the Chat Completions API, the Assistants API, and the Batch API. However, I am having trouble getting it to work with the Batch API.
Despite following the guidelines provided, I cannot seem to find a way to utilize function calling effectively within the Batch API. If anyone has experience or insights regarding this functionality, I would greatly appreciate your assistance.
Additional context
I have reviewed the examples provided for function calling in the Chat Completions API and Assistants API, but I haven't found similar examples for the Batch API. My goal is to implement a conversational assistant that can process multiple requests in a batch while utilizing function calling to retrieve and respond with real-time data.
I am particularly interested in understanding the specific requirements or limitations when using function calling with the Batch API. If there are any differences in implementation compared to the other APIs, that information would be very helpful.
I think the Batch API in this case is more about delayed execution of a large volume of requests (completed within 24 hours, at a discount) than about the fast parallel processing you may have in mind, which other APIs provide.
In general, each batch item is just the same series of messages, which can contain function calls and the results of those calls, so I don't see any difference here.
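To illustrate the comment above: each line of a batch input file wraps an ordinary Chat Completions request, so a `tools` array can go in the `body` exactly as in a direct call. A minimal sketch using only the Python standard library to build such a line (the `get_weather` function, its schema, and the model name are made up for illustration; this is not a snippet from the .NET library):

```python
import json

# Hypothetical tool definition: name, description, and schema are invented.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def batch_line(custom_id: str, user_content: str) -> str:
    """Build one JSONL line for a batch input file: a wrapped
    Chat Completions request body that carries a tools array."""
    request = {
        "custom_id": custom_id,
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4o",
            "messages": [{"role": "user", "content": user_content}],
            "tools": [weather_tool],
        },
    }
    return json.dumps(request)

# Each request becomes one line of the uploaded .jsonl file.
lines = [
    batch_line("req-1", "What is the weather in Paris?"),
    batch_line("req-2", "What is the weather in Tokyo?"),
]
print("\n".join(lines))
```

One caveat: a batch cannot round-trip a tool call within itself. If a batched response finishes with tool calls, you would execute the function yourself and, if needed, submit a follow-up batch whose messages include the assistant's tool call and a tool-role message with the result.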