Chunk size increases for long streams #1990

Closed
@pedrocarnevale

Description

Hello everyone, I have a Next.js project with this code in an API route:

```ts
// Imports and client setup were omitted in the original snippet;
// this assumes the `openai` SDK and the Vercel AI SDK ('ai') are installed.
import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';

const openai = new OpenAI();

const response = await openai.chat.completions.create({
  model: 'gpt-3.5-turbo-0125',
  stream: true,
  temperature: 0.7,
  max_tokens: 4000,
  messages: [
    {
      role: 'system',
      content: systemMessage,
    },
    {
      role: 'user',
      content: humanMessage,
    },
  ],
});

const stream = OpenAIStream(response);

return new StreamingTextResponse(
  stream,
  {
    headers: {
      resultDbId,
      outputDbId,
    },
  },
  // Note: the AI SDK expects a StreamData instance as the third argument.
  searchResult
);
```

The text generated by the stream is rendered on screen as soon as it arrives from the API, in real time. For short texts this works perfectly: the frontend receives roughly one character at a time. But for long texts it receives whole lines of text per chunk instead of one character at a time, and I don't want that effect. How do I fix this?
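
One possible workaround (a sketch, not an official fix from this issue): chunk size is determined by how the upstream response and the network batch bytes, so you can re-split the stream on the server before returning it. The example below assumes `stream` is a `ReadableStream<Uint8Array>` as returned by `OpenAIStream`, which encodes text before streaming; it decodes each chunk and re-enqueues it character by character.

```ts
// A sketch, assuming `stream` is the ReadableStream<Uint8Array> returned
// by OpenAIStream: re-chunk the encoded stream so the client receives one
// character at a time, regardless of how large the upstream chunks grow.
const decoder = new TextDecoder();
const encoder = new TextEncoder();

const rechunked = stream.pipeThrough(
  new TransformStream<Uint8Array, Uint8Array>({
    transform(chunk, controller) {
      // Decode the (possibly multi-line) chunk, then emit it character
      // by character so the frontend sees a steady trickle of text.
      for (const char of decoder.decode(chunk, { stream: true })) {
        controller.enqueue(encoder.encode(char));
      }
    },
  })
);

return new StreamingTextResponse(rechunked);
```

Note that this only changes how the server re-emits chunks; intermediate proxies and the browser's network stack can still coalesce small chunks, so perfectly even pacing may require smoothing on the client instead.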

Code example

No response

Additional context

No response

Metadata

Assignees

No one assigned

Labels

ai/ui, bug

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
