
[Feature]: add litellm for llm completion and tool-call #91

@gusye1234

Description


Feature Description

Currently, acontext-core uses a hand-written LLM abstraction layer to support the OpenAI and Anthropic SDKs. This is unnecessary, prone to bugs, and may not support some of the latest models (gpt-5.2, gemini...).

We plan to use litellm as a proxy over the provider LLM SDKs.
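As a rough sketch of what this could look like: litellm exposes an OpenAI-compatible `completion()` call and translates the model-name prefix to the right provider SDK, including tool-call schemas. The `complete` wrapper and `WEATHER_TOOL` below are hypothetical names for illustration, not existing acontext-core code.

```python
# Hypothetical sketch of routing acontext-core completions through litellm.
# Tool definitions use the OpenAI function-calling schema, which litellm
# translates for each provider.
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}


def complete(model, messages, tools=None):
    """One entry point for any provider: e.g. 'gpt-4o',
    'claude-3-5-sonnet-20241022', or 'gemini/gemini-1.5-pro'.
    litellm maps the model string to the matching SDK."""
    import litellm  # imported lazily; needs the provider's API key in the env

    return litellm.completion(model=model, messages=messages, tools=tools)


if __name__ == "__main__":
    resp = complete(
        "gpt-4o",
        [{"role": "user", "content": "What's the weather in Paris?"}],
        tools=[WEATHER_TOOL],
    )
    # Responses come back normalized to the OpenAI shape for every provider,
    # so tool-call handling is written once:
    print(resp.choices[0].message.tool_calls)
```

This would let the core service drop the per-provider branches entirely: adding a new model becomes a config change (a new model string) rather than new abstraction code.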

Impact Component/Area

Which part of Acontext would this feature affect?

  • Client SDK (Python)
  • Client SDK (TypeScript)
  • Core Service (Python)
  • API Server (Go)
  • UI/Dashboard (Next.js)
  • CLI Tool
  • Documentation
  • Other (please specify)

Metadata


Assignees

No one assigned

Labels

enhancement (New feature or request)

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
