Anthropic error pattern not recognized for context window detection #4381

@npkanaka

Description

Problem

When using Anthropic/Claude models with respect_context_window=True, context window errors are not detected because CrewAI's error pattern matching doesn't recognize Anthropic's error message format.

Anthropic error: "prompt is too long: 210094 tokens > 200000 maximum"

Impact

The respect_context_window=True flag enables automatic summarization when context limits are exceeded, but this feature doesn't work for Anthropic models because the error detection fails.
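For reference, a minimal agent configuration that relies on this behavior (the role/goal/backstory values and the model string below are illustrative placeholders, not taken from a real project):

from crewai import Agent, LLM

agent = Agent(
    role="Researcher",
    goal="Summarize long source documents",
    backstory="Placeholder backstory for illustration",
    llm=LLM(model="anthropic/claude-3-5-sonnet-20241022"),  # illustrative model id
    respect_context_window=True,  # expected to trigger summarization when the context limit is hit
)

With an Anthropic model, the overflow error is raised but never recognized, so this summarization path is skipped.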

Root Cause

File: crewai/utilities/exceptions/context_window_exceeding_exception.py

The CONTEXT_LIMIT_ERRORS list only includes OpenAI-style error patterns:

CONTEXT_LIMIT_ERRORS: Final[list[str]] = [
    "expected a string with maximum length",
    "maximum context length",
    "context length exceeded",
    "context_length_exceeded",
    "context window full",
    "too many tokens",
    "input is too long",
    "exceeds token limit",
]

But Anthropic returns "prompt is too long: 210094 tokens > 200000 maximum", which matches none of these substrings, so the exception is never classified as a context-window error and the summarization fallback is not triggered. See the sketch below.
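A minimal sketch of a possible fix, assuming detection is a case-insensitive substring match against CONTEXT_LIMIT_ERRORS (the is_context_limit_error helper below is a hypothetical stand-in for that check, not the library's actual function): adding Anthropic's phrasing to the list makes the message match.

from typing import Final

CONTEXT_LIMIT_ERRORS: Final[list[str]] = [
    "expected a string with maximum length",
    "maximum context length",
    "context length exceeded",
    "context_length_exceeded",
    "context window full",
    "too many tokens",
    "input is too long",
    "exceeds token limit",
    # Anthropic-style message, e.g. "prompt is too long: 210094 tokens > 200000 maximum"
    "prompt is too long",
]

def is_context_limit_error(error_message: str) -> bool:
    # Hypothetical check: case-insensitive substring match against the known patterns
    return any(pattern in error_message.lower() for pattern in CONTEXT_LIMIT_ERRORS)

# The Anthropic message now matches via the new "prompt is too long" pattern
assert is_context_limit_error("prompt is too long: 210094 tokens > 200000 maximum")

The pattern is deliberately generic ("prompt is too long") rather than tied to specific token counts, so it should match any Anthropic model's limit.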
