
FR: Streaming request #19

Closed
svilupp opened this issue Mar 26, 2023 · 3 comments

Comments

@svilupp
Contributor

svilupp commented Mar 26, 2023

It would be excellent to add streaming as an option for OpenAI requests.

Given how long it takes to generate a response, seeing the partial response coming through would be better than a progress bar (and allows users to react/think through the response).

Server-Sent Events are supported by the OpenAI API, and the OpenAI cookbook has examples of consuming them with the Python SDK (SSE).

We should be able to achieve the same with HTTP.jl; see its Request Streaming documentation.
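For reference, the wire format is simple: each SSE event is a block of lines ending in a blank line, with the JSON payload on a `data:` line and a `[DONE]` sentinel closing the stream. A minimal parsing sketch (in Python, since that is what the cookbook shows; the chunk boundaries and event shapes below are illustrative, not from the issue):

```python
import json

def parse_sse_chunks(raw_stream):
    """Parse raw Server-Sent Events bytes into JSON payloads.

    `raw_stream` is any iterable of byte chunks as a streaming HTTP
    client would deliver them; events are separated by a blank line,
    payload lines start with "data: ", and the OpenAI API ends the
    stream with a "[DONE]" sentinel.
    """
    buffer = b""
    for chunk in raw_stream:
        buffer += chunk
        # A blank line terminates each SSE event.
        while b"\n\n" in buffer:
            event, buffer = buffer.split(b"\n\n", 1)
            for line in event.splitlines():
                if line.startswith(b"data: "):
                    payload = line[len(b"data: "):]
                    if payload == b"[DONE]":
                        return
                    yield json.loads(payload)

# Hypothetical stream, split across arbitrary chunk boundaries to show
# that partial events are buffered until the terminating blank line:
chunks = [
    b'data: {"choices": [{"delta": {"content": "Hel"}}]}\n\n',
    b'data: {"choices": [{"de',
    b'lta": {"content": "lo"}}]}\n\ndata: [DONE]\n\n',
]
parts = [e["choices"][0]["delta"]["content"] for e in parse_sse_chunks(chunks)]
print("".join(parts))  # → Hello
```

The same buffering logic should port directly to an HTTP.jl response-streaming callback.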

As always, I’m happy to take a stab at it if there is interest.

@ThatcherC
Contributor

I have some code for this I was developing off of v0.6.0! I'll try to bring it up to v0.8.0 and put out a PR. I didn't quite figure out how to make the callbacks work nicely but it might not be too far off.
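One way a callback interface might fit together (a self-contained Python sketch with hypothetical names and event shapes; the actual PR would target the Julia API, and the decoded events here stand in for what a streaming HTTP response would yield):

```python
def stream_completion(events, on_delta):
    """Invoke `on_delta` for each partial token as it arrives.

    `events` stands in for decoded SSE payloads from a streaming
    request; a real implementation would read them off the HTTP
    response body instead of a list. The event shape follows the
    OpenAI chat-completion delta format.
    """
    collected = []
    for event in events:
        delta = event["choices"][0]["delta"].get("content", "")
        if delta:
            on_delta(delta)        # e.g. print to the REPL as text arrives
            collected.append(delta)
    return "".join(collected)      # full response, for the usual return value

# Illustrative events only:
events = [
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo"}}]},
    {"choices": [{"delta": {}}]},  # final event often carries no content
]
text = stream_completion(events, on_delta=lambda d: print(d, end=""))
```

Keeping the callback for display-only side effects while still returning the assembled string means the streaming path can keep the same return type as the blocking one.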

@svilupp
Contributor Author

svilupp commented Mar 26, 2023

Awesome! I look forward to testing it out - it would make the wait so much less painful!

My next step is to build out self-iterative loops (à la CodeT/Reflexion) and, potentially, parallelize different tasks to reduce the time to a ready solution.

@roryl23
Collaborator

roryl23 commented Mar 26, 2023

This is a fantastic idea
