
Difficulty Using Function Calling in OpenAI Batch API #245

Open
mehrdad-tat opened this issue Oct 7, 2024 · 1 comment

Comments

@mehrdad-tat

Confirm this is a feature request for the .NET library and not the underlying OpenAI API

  • This is a feature request for the .NET library

Describe the feature or improvement you are requesting

I am trying to implement function calling with the OpenAI Batch API, as mentioned in the documentation here. The documentation states that function calling is supported in the Chat Completions API, the Assistants API, and the Batch API. However, I am having trouble getting it to work with the Batch API.

Despite following the guidelines provided, I cannot seem to find a way to utilize function calling effectively within the Batch API. If anyone has experience or insights regarding this functionality, I would greatly appreciate your assistance.

(Screenshot: 2024-10-07 085831)

Additional context

I have reviewed the examples provided for function calling in the Chat Completions API and Assistants API, but I haven't found similar examples for the Batch API. My goal is to implement a conversational assistant that can process multiple requests in a batch while utilizing function calling to retrieve and respond with real-time data.

I am particularly interested in understanding the specific requirements or limitations when using function calling with the Batch API. If there are any differences in implementation compared to the other APIs, that information would be very helpful.
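For context, this is roughly what I would expect a batch input line with function calling to look like: each JSONL line targets `/v1/chat/completions` and carries the same `tools` array a non-batched chat request would. The model name, tool name, and parameter schema below are my own placeholders, not from the official examples:

```python
import json

# Hypothetical tool definition; name, description, and parameters
# are placeholders for illustration only.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def batch_line(custom_id: str, user_message: str) -> str:
    """Build one JSONL line for the Batch API input file."""
    request = {
        "custom_id": custom_id,
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4o-mini",  # placeholder model
            "messages": [{"role": "user", "content": user_message}],
            "tools": [get_weather_tool],
        },
    }
    return json.dumps(request)

# Two independent requests batched into one input file.
jsonl = "\n".join(
    batch_line(f"request-{i}", msg)
    for i, msg in enumerate(["Weather in Paris?", "Weather in Tokyo?"])
)
print(jsonl)
```

The resulting file would then be uploaded and referenced when creating the batch, the same as any other batch input.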

@HavenDV
Contributor

HavenDV commented Oct 9, 2024

I think the Batch API in this case is more about delayed execution of a large volume of requests (within 24 hours, at a discount) than about the fast parallel processing of requests that I think you have in mind, which happens in some other APIs.
In general, this is just the same series of messages, which will contain the function calls and the results of those calls, and I don't see any difference here.
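A rough sketch of that follow-up round: when a batched completion comes back with tool calls, you execute the function locally and submit a second batch request whose messages include the assistant's tool-call turn and a `tool` message with the result. The call id, tool name, and arguments below are placeholders:

```python
import json

# Placeholder id; in practice this comes from the batch output file.
tool_call_id = "call_abc123"

messages = [
    {"role": "user", "content": "Weather in Paris?"},
    {
        # Assistant turn, echoed back as returned in the batch output.
        "role": "assistant",
        "content": None,
        "tool_calls": [{
            "id": tool_call_id,
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "arguments": json.dumps({"city": "Paris"}),
            },
        }],
    },
    {
        # Result of running the function locally, keyed by the call id.
        "role": "tool",
        "tool_call_id": tool_call_id,
        "content": json.dumps({"temperature_c": 18, "condition": "cloudy"}),
    },
]

# Second-round batch line: same shape as the first, with the
# accumulated message series in the body.
followup = {
    "custom_id": "request-0-round-2",
    "method": "POST",
    "url": "/v1/chat/completions",
    "body": {"model": "gpt-4o-mini", "messages": messages},
}
print(json.dumps(followup))
```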
