
Conversation

@PeterDaveHello (Contributor) commented Nov 29, 2025

User description

GitHub Copilot Summary:

This pull request expands support for new Anthropic Claude model versions and regional endpoints by updating the token limit mapping in pr_agent/algo/__init__.py. The main changes are the addition of several new Claude model identifiers and their associated token limits, including new versions and Bedrock regional endpoints.

Model and Token Limit Updates

  • Added support for new Claude model versions in Vertex AI, including claude-haiku-4-5@20251001, claude-opus-4-1@20250805, and claude-opus-4-5@20251101 with increased token limits.
  • Added support for new Claude model versions in Anthropic, including claude-opus-4-1-20250805, claude-opus-4-5-20251101, and claude-haiku-4-5-20251001, each with a 200,000 token limit.

Bedrock Regional Endpoint Expansion

  • Added new Bedrock endpoints for the latest Claude models, including support for US, EU, AU, JP, APAC, and global regions for claude-haiku-4-5-20251001 and claude-opus-4-5-20251101, all with 200,000 token limits.
  • Added additional Bedrock endpoints for claude-sonnet-4-5-20250929 and claude-sonnet-4-20250514 in global, AU, EU, JP, and APAC regions.

These updates ensure that the system can recognize and handle new Claude model versions and endpoints, maintaining compatibility and scalability as new models are released.
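
For illustration, the new entries share the key/value shape of the existing MAX_TOKENS mapping. The sketch below uses the identifiers listed above; any provider prefix conventions in the actual file may differ:

# Sketch only: model IDs are those listed above; prefix conventions in
# pr_agent/algo/__init__.py may differ from what is shown here.
MAX_TOKENS = {
    # Vertex AI identifiers use an "@" date suffix
    "claude-haiku-4-5@20251001": 200000,
    "claude-opus-4-1@20250805": 200000,
    "claude-opus-4-5@20251101": 200000,
    # Anthropic direct API identifiers use a "-" date suffix
    "claude-opus-4-1-20250805": 200000,
    "claude-opus-4-5-20251101": 200000,
    "claude-haiku-4-5-20251001": 200000,
    # Bedrock identifiers add a regional prefix and a "-v1:0" version suffix
    "bedrock/us.anthropic.claude-opus-4-5-20251101-v1:0": 200000,
    "bedrock/global.anthropic.claude-haiku-4-5-20251001-v1:0": 200000,
}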

Reference:


PR Type

Enhancement


Description

  • Add support for latest Claude 4-4.5 series models across multiple platforms

  • Expand Vertex AI model entries with new Claude versions and token limits

  • Add Anthropic direct API support for new Claude model versions

  • Extend Bedrock regional endpoints for Claude models across US, EU, AU, JP, APAC, and global regions


Diagram Walkthrough

flowchart LR
  A["New Claude Models<br/>4-4.5 Series"] --> B["Vertex AI<br/>Endpoints"]
  A --> C["Anthropic<br/>Direct API"]
  A --> D["Bedrock<br/>Regional Endpoints"]
  B --> E["Token Limits<br/>200K"]
  C --> E
  D --> E
  D --> F["Multi-Region<br/>Support"]

File Walkthrough

Relevant files
Enhancement
__init__.py
Expand Claude model support across platforms and regions 

pr_agent/algo/__init__.py

  • Added 3 new Vertex AI Claude model entries: claude-haiku-4-5@20251001,
    claude-opus-4-1@20250805, and claude-opus-4-5@20251101 with 200K token
    limits
  • Added 4 new Anthropic direct API model entries:
    claude-opus-4-1-20250805, claude-opus-4-5-20251101,
    claude-haiku-4-5-20251001 with 200K token limits
  • Added 15 new Bedrock regional endpoint entries supporting Claude 4-4.5
    models across US, EU, AU, JP, APAC, and global regions with 200K token
    limits
  • All new entries maintain consistent 200K token limit configuration for
    latest Claude versions
+22/-0   
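
A minimal sketch of how such a mapping is typically consumed; the helper name and fallback value below are illustrative, not the project's actual implementation:

from pr_agent.algo import MAX_TOKENS  # the dictionary extended by this PR

DEFAULT_TOKEN_LIMIT = 32000  # hypothetical fallback for models not in the mapping

def lookup_token_limit(model_name: str) -> int:
    # Unknown models fall back to a conservative default
    return MAX_TOKENS.get(model_name, DEFAULT_TOKEN_LIMIT)

print(lookup_token_limit("claude-opus-4-5-20251101"))  # 200000 after this PR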

@qodo-merge-for-open-source (Contributor)

PR Compliance Guide 🔍

Below is a summary of compliance checks for this PR:

Security Compliance
🟢
No security concerns identified
No security vulnerabilities detected by AI analysis. Human verification advised for critical code.
Ticket Compliance
🎫 No ticket provided
  • Create ticket/issue
Codebase Duplication Compliance
Codebase context is not defined

Follow the guide to enable codebase context checks.

Custom Compliance
🟢
Consistent Naming Conventions

Objective: All new variables, functions, and classes must follow the project's established naming
standards

Status: Passed

No Dead or Commented-Out Code

Objective: Keep the codebase clean by ensuring all submitted code is active and necessary

Status: Passed

Robust Error Handling

Objective: Ensure potential errors and edge cases are anticipated and handled gracefully throughout
the code

Status: Passed

Single Responsibility for Functions

Objective: Each function should have a single, well-defined responsibility

Status: Passed

When relevant, utilize early return

Objective: In a code snippet containing multiple logic conditions (such as 'if-else'), prefer an
early return on edge cases over deep nesting

Status: Passed

Compliance status legend:
🟢 - Fully compliant
🟡 - Partially compliant
🔴 - Not compliant
⚪ - Requires further human verification
🏷️ - Compliance label

@qodo-merge-for-open-source (Contributor)

PR Code Suggestions ✨

Explore these optional code suggestions:

Category: High-level
Refactor model configuration for scalability

Refactor the MAX_TOKENS dictionary to avoid exhaustively listing every regional
model variant. Instead, handle these variations programmatically, for instance
by using pattern matching, to improve maintainability.

Examples:

pr_agent/algo/__init__.py [134-139]
    "bedrock/us.anthropic.claude-haiku-4-5-20251001-v1:0": 200000,
    "bedrock/eu.anthropic.claude-haiku-4-5-20251001-v1:0": 200000,
    "bedrock/au.anthropic.claude-haiku-4-5-20251001-v1:0": 200000,
    "bedrock/jp.anthropic.claude-haiku-4-5-20251001-v1:0": 200000,
    "bedrock/apac.anthropic.claude-haiku-4-5-20251001-v1:0": 200000,
    "bedrock/global.anthropic.claude-haiku-4-5-20251001-v1:0": 200000,

Solution Walkthrough:

Before:

MAX_TOKENS = {
    # ... other models
    "bedrock/us.anthropic.claude-haiku-4-5-20251001-v1:0": 200000,
    "bedrock/eu.anthropic.claude-haiku-4-5-20251001-v1:0": 200000,
    "bedrock/au.anthropic.claude-haiku-4-5-20251001-v1:0": 200000,
    "bedrock/jp.anthropic.claude-haiku-4-5-20251001-v1:0": 200000,
    "bedrock/apac.anthropic.claude-haiku-4-5-20251001-v1:0": 200000,
    "bedrock/global.anthropic.claude-haiku-4-5-20251001-v1:0": 200000,
    # ... many more similar entries
}

def get_max_tokens(model_name):
    return MAX_TOKENS.get(model_name, DEFAULT_TOKENS)

After:

import re

# Region-free base identifiers shared by all regional Bedrock endpoints
BASE_MAX_TOKENS = {
    "anthropic.claude-haiku-4-5-20251001-v1:0": 200000,
    "anthropic.claude-opus-4-1-20250805-v1:0": 200000,
    # ... other base models without regional prefixes
}
# Keep specific overrides if needed
MAX_TOKENS = { ... }

def get_max_tokens(model_name):
    if model_name in MAX_TOKENS:
        return MAX_TOKENS[model_name]

    if model_name.startswith("bedrock/"):
        # Strip the 'bedrock/' prefix and any regional prefix like 'us.', 'eu.', 'global.'
        base_model_name = re.sub(r'^bedrock/((us|eu|au|jp|apac|global)\.)?', '', model_name)
        if base_model_name in BASE_MAX_TOKENS:
            return BASE_MAX_TOKENS[base_model_name]

    # Fallback logic
    return DEFAULT_TOKENS
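
A quick self-contained check of the prefix stripping (illustrative only):

import re

# Regional Bedrock IDs collapse to the same base identifier
model = "bedrock/apac.anthropic.claude-haiku-4-5-20251001-v1:0"
base = re.sub(r'^bedrock/((us|eu|au|jp|apac|global)\.)?', '', model)
print(base)  # anthropic.claude-haiku-4-5-20251001-v1:0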
Suggestion importance[1-10]: 7


Why: The suggestion correctly identifies a scalability issue in the MAX_TOKENS dictionary, where numerous regional model variants are listed explicitly. Proposing a programmatic approach is a valid architectural improvement for long-term maintainability, although it's not a critical bug.

Impact: Medium
  • Author self-review: I have reviewed the PR code suggestions, and addressed the relevant ones.

pr_agent/algo/__init__.py (inline review context):

'bedrock/anthropic.claude-sonnet-4-20250514-v1:0': 200000,
'bedrock/anthropic.claude-sonnet-4-5-20250929-v1:0': 200000,
"bedrock/us.anthropic.claude-opus-4-20250514-v1:0": 200000,
"bedrock/us.anthropic.claude-opus-4-1-20250805-v1:0": 200000,

@SeverineVerlinden commented Dec 4, 2025


"bedrock/us.anthropic.claude-opus-4-5-20251101-v1:0"

Non-blocking: Could you also add the following model; I see the global deployment is added; but regionals could be beneficial as well? Otherwise happy to handle in a separate MR.
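
For reference, the requested regional variants would presumably mirror the existing pattern; a sketch only, assuming the same 200,000-token limit as the other Claude 4.5 entries:

# Illustrative sketch of the regional entries requested in the comment above
"bedrock/us.anthropic.claude-opus-4-5-20251101-v1:0": 200000,
"bedrock/eu.anthropic.claude-opus-4-5-20251101-v1:0": 200000,
"bedrock/au.anthropic.claude-opus-4-5-20251101-v1:0": 200000,
"bedrock/jp.anthropic.claude-opus-4-5-20251101-v1:0": 200000,
"bedrock/apac.anthropic.claude-opus-4-5-20251101-v1:0": 200000,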

