
[BUG] respect_context_window doesn't work #2659

Closed
@maxkochanoff

Description


Sometimes I hit a litellm error about exceeding the context window, even though the respect_context_window parameter is set to True by default. I thought respect_context_window was supposed to prevent this by automatically summarizing the conversation. I'm not sure whether this is a bug or whether I'm doing something wrong; I'd be grateful for any help.

P.S. In my setup, some of my tools return fairly long output (though definitely still under the maximum context length). Maybe the problem occurs when an agent retries a tool call after a failure, since the messages would then accumulate a lot of text.
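For reference, the kind of pre-flight guard I expected respect_context_window to perform can be sketched in plain Python. This is purely illustrative, not crewAI's actual implementation: the token heuristic and the truncation strategy are my assumptions (crewAI reportedly summarizes rather than drops messages), and the limits are taken from the Mistral error in the Evidence section.

```python
MODEL_CONTEXT_LIMIT = 15000   # context limit reported by the Mistral error
MAX_COMPLETION_TOKENS = 2048  # completion budget reserved in the same error


def rough_token_count(text: str) -> int:
    # Crude ~4 chars/token heuristic; a real check would use the
    # model's own tokenizer. Illustrative only.
    return max(1, len(text) // 4)


def fits_context(messages: list[str]) -> bool:
    # The request fails when prompt tokens plus the completion budget
    # exceed the model's context limit.
    prompt_tokens = sum(rough_token_count(m) for m in messages)
    return prompt_tokens + MAX_COMPLETION_TOKENS <= MODEL_CONTEXT_LIMIT


def trim_to_fit(messages: list[str]) -> list[str]:
    # Drop the oldest messages until the request fits. crewAI is said
    # to summarize instead, but the guard condition is the same.
    trimmed = list(messages)
    while trimmed and not fits_context(trimmed):
        trimmed.pop(0)
    return trimmed
```

If the failed tool calls from my P.S. pile up in the message history, a guard like this would be exactly what keeps the request under the limit.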

Steps to Reproduce

None

Expected behavior

No errors

Screenshots/Code snippets

None

Operating System

Ubuntu 20.04

Python Version

3.10

crewAI Version

0.114.0

crewAI Tools Version

0.40.1

Virtual Environment

Venv

Evidence

The exact error:

litellm.ContextWindowExceededError: litellm.BadRequestError: ContextWindowExceededError: MistralException - Error code: 400 - {'object': 'error', 'message': "This model's maximum context lenght is 15000 tokens. However, you requested 15018 tokens (12970 in the messages, 2048 in the completion). Please reduce the length of the messages or completion.", 'type': 'BadRequestError', 'param': None, 'code': 400}
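The numbers in the error add up exactly: the prompt plus the reserved completion budget overshoot the model's limit by a small margin, which fits the theory that retry messages pushed it just over the edge.

```python
# Arithmetic straight from the error message above.
context_limit = 15000      # model's maximum context length
prompt_tokens = 12970      # tokens in the messages
completion_tokens = 2048   # tokens reserved for the completion

requested = prompt_tokens + completion_tokens  # 15018, as reported
overflow = requested - context_limit           # only 18 tokens over
```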

Possible Solution

None

Additional context

None

Metadata

Labels

bug (Something isn't working)