Local memory and context window length #177

Open
@raunak-kondiboyina

Question

  1. Is there a way to keep the same questions from being asked again, e.g. by maintaining a local or session memory, so the agent does not get stuck in an infinite loop? (See the first sketch after this list.)
  2. How does the SDK handle the context window? For example, if a tool call fetches data whose size exceeds the input context window, will the SDK automatically summarize the result before making the next LLM call, or will it break? (See the second sketch after this list.)
    Example -
    openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid 'input[2].output': string too long. Expected a string with maximum length 256000, but got a string with length 288701 instead.", 'type': 'invalid_request_error', 'param': 'input[2].output', 'code': 'string_above_max_length'}}

I am getting the error above, but it should be possible to summarize the response and send that instead.
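
For (1), a minimal sketch assuming a recent openai-agents Python release that ships session memory (SQLiteSession): the session replays prior turns into each run, so the model can see what it already asked and answered. The agent name, instructions, and session ID here are placeholders.

```python
from agents import Agent, Runner, SQLiteSession

agent = Agent(
    name="Assistant",
    instructions="Answer concisely and do not re-ask questions already answered.",
)

# The session persists conversation history locally, so earlier Q&A is
# included in later runs instead of being asked again.
session = SQLiteSession("conversation_177")

result = Runner.run_sync(agent, "What city is the Golden Gate Bridge in?", session=session)
print(result.final_output)

# A follow-up in the same session sees the stored history.
result = Runner.run_sync(agent, "What did I just ask you?", session=session)
print(result.final_output)
```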
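
For (2), the 400 error above suggests the SDK does not summarize oversized tool output on its own, so one workaround is to bound the payload inside the tool before it is appended to the conversation. This is a hypothetical sketch: fetch_data, lookup_backend, and the character cap are mine, not SDK APIs; a real version might replace the naive truncation with a separate summarization call over the full payload.

```python
from agents import function_tool

MAX_TOOL_OUTPUT_CHARS = 200_000  # margin under the 256,000-char limit in the error


def lookup_backend(query: str) -> str:
    # Placeholder data source so the sketch is self-contained;
    # deliberately returns an oversized string.
    return ("result for " + query + "\n") * 20_000


def bound_output(text: str) -> str:
    """Truncate oversized payloads; a real version might summarize instead."""
    if len(text) <= MAX_TOOL_OUTPUT_CHARS:
        return text
    return text[:MAX_TOOL_OUTPUT_CHARS] + "\n[... output truncated ...]"


@function_tool
def fetch_data(query: str) -> str:
    """Hypothetical tool that can return very large results."""
    raw = lookup_backend(query)
    return bound_output(raw)
```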

Labels: question (Question about using the SDK)