### Problem
When using Anthropic/Claude models with `respect_context_window=True`, context window errors are not detected because CrewAI's error pattern matching doesn't recognize Anthropic's error message format.

Anthropic error: `prompt is too long: 210094 tokens > 200000 maximum`
### Impact
The `respect_context_window=True` flag enables automatic summarization when context limits are exceeded, but this feature doesn't work for Anthropic models because the error detection fails.
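For reference, a minimal sketch of how the flag is typically enabled on a CrewAI agent. The role, goal, backstory, and model id below are placeholder assumptions, not values from this report:

```python
from crewai import Agent

# Hedged sketch: respect_context_window asks CrewAI to summarize the conversation
# instead of failing when the model's context window is exceeded.
researcher = Agent(
    role="Researcher",                                  # placeholder
    goal="Summarize long documents",                    # placeholder
    backstory="An analyst working with very large inputs.",
    llm="anthropic/claude-3-5-sonnet-20241022",         # assumption: any Anthropic model id
    respect_context_window=True,  # should trigger summarization on context-limit errors
)
```

With an Anthropic model, this path currently breaks down because the error raised by the provider is never classified as a context-limit error.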
### Root Cause
File: `crewai/utilities/exceptions/context_window_exceeding_exception.py`

The `CONTEXT_LIMIT_ERRORS` list only includes OpenAI-style error patterns:
```python
from typing import Final

CONTEXT_LIMIT_ERRORS: Final[list[str]] = [
    "expected a string with maximum length",
    "maximum context length",
    "context length exceeded",
    "context_length_exceeded",
    "context window full",
    "too many tokens",
    "input is too long",
    "exceeds token limit",
]
```

But Anthropic returns: `prompt is too long: 210094 tokens > 200000 maximum`
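A minimal sketch of how the detection could be extended, assuming the check is a case-insensitive substring match over the exception message. The exact pattern string added by PR #4371 may differ, and the helper `is_context_limit_error` is hypothetical, shown only to illustrate the matching:

```python
from typing import Final

CONTEXT_LIMIT_ERRORS: Final[list[str]] = [
    "expected a string with maximum length",
    "maximum context length",
    "context length exceeded",
    "context_length_exceeded",
    "context window full",
    "too many tokens",
    "input is too long",
    "exceeds token limit",
    # Anthropic-style message, e.g. "prompt is too long: 210094 tokens > 200000 maximum"
    "prompt is too long",
]


def is_context_limit_error(error_message: str) -> bool:
    """Hypothetical helper: case-insensitive substring match against known patterns."""
    message = error_message.lower()
    return any(pattern in message for pattern in CONTEXT_LIMIT_ERRORS)


# The Anthropic error from this report would now be recognized:
assert is_context_limit_error("prompt is too long: 210094 tokens > 200000 maximum")
```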
### Related
- PR #4371: Add Anthropic error pattern for context window detection (implements this fix)