
Chunk size increases for long streams #1990

Open

pedrocarnevale opened this issue Jun 17, 2024 · 0 comments
Labels: ai/ui, bug (Something isn't working)

@pedrocarnevale

Description

Hello everyone, I have a Next.js project with this code in an API route:

import { OpenAIStream, StreamingTextResponse } from 'ai';

const response = await openai.chat.completions.create({
  model: 'gpt-3.5-turbo-0125',
  stream: true,
  temperature: 0.7,
  max_tokens: 4000,
  messages: [
    {
      role: 'system',
      content: systemMessage,
    },
    {
      role: 'user',
      content: humanMessage,
    },
  ],
});

// Wrap the OpenAI streaming response in a ReadableStream of text chunks.
const stream = OpenAIStream(response);

// Stream the text back to the client, attaching the database IDs as
// response headers; searchResult goes into the optional StreamData slot.
return new StreamingTextResponse(
  stream,
  {
    headers: {
      resultDbId,
      outputDbId,
    },
  },
  searchResult
);

The text generated by the stream is rendered on screen in real time, as soon as it arrives from the API. For short texts this works perfectly: the frontend receives almost one character at a time. But for long texts it receives whole lines of text at once instead of single characters, and I don't want this effect. How do I fix this?
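
One possible workaround, sketched below, is to re-split whatever chunk sizes arrive into single characters on the server before they reach the client. This is a sketch, not an official ai package API: it assumes OpenAIStream yields UTF-8 encoded Uint8Array chunks, and the smooth transform and its 10 ms pacing delay are illustrative names and values, not a verified fix.

// Hypothetical smoothing transform (not part of the ai package):
// decode each incoming chunk, re-emit it one character at a time,
// and re-encode it. The { stream: true } option keeps multi-byte
// UTF-8 characters intact when they span chunk boundaries.
const decoder = new TextDecoder();
const encoder = new TextEncoder();

const smooth = new TransformStream<Uint8Array, Uint8Array>({
  async transform(chunk, controller) {
    const text = decoder.decode(chunk, { stream: true });
    for (const char of text) {
      controller.enqueue(encoder.encode(char));
      // Optional pacing so the client renders characters gradually;
      // the 10 ms value is arbitrary.
      await new Promise((resolve) => setTimeout(resolve, 10));
    }
  },
});

// Replaces the plain OpenAIStream(response) line above.
const stream = OpenAIStream(response).pipeThrough(smooth);

If the coalescing happens at a proxy or network layer rather than inside the SDK, server-side re-splitting may not be enough, and smoothing the rendering on the client may be the more reliable fix.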

Code example

No response

Additional context

No response

@lgrammel added the bug (Something isn't working) and ai/ui labels on Jun 18, 2024