
@Milofax Milofax commented Jan 17, 2026

Summary

The LLMClientFactory and EmbedderFactory in factories.py read the api_url from the config but don't pass it to the client constructors. This causes:

  • AnthropicClient to always use https://api.anthropic.com instead of configured URL
  • OpenAIEmbedder to always use https://api.openai.com/v1 instead of configured URL

This breaks self-hosted/proxy setups where users want to route requests through custom endpoints.
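The failure mode can be sketched as follows. This is a minimal illustration, not the actual factory code: the config and client classes are hypothetical stand-ins, and only the shape of the bug (reading `api_url` without forwarding it) mirrors the real `factories.py`.

```python
from dataclasses import dataclass

# Hypothetical stand-ins for the real config and client classes.
@dataclass
class AnthropicProviderConfig:
    api_url: str = "https://my-proxy.internal/v1"  # user-configured endpoint

@dataclass
class AnthropicClient:
    base_url: str = "https://api.anthropic.com"  # library default

def make_llm_client(cfg: AnthropicProviderConfig) -> AnthropicClient:
    # Bug pattern: cfg.api_url is read but never forwarded, so the
    # client silently falls back to the official endpoint.
    _ = cfg.api_url
    return AnthropicClient()

client = make_llm_client(AnthropicProviderConfig())
print(client.base_url)  # still the official endpoint, not the proxy
```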

Changes

  • Pass base_url=config.providers.anthropic.api_url to GraphitiLLMConfig for Anthropic
  • Pass base_url=config.providers.openai.api_url to OpenAIEmbedderConfig for OpenAI embedder
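Continuing the sketch above, the fix is to forward the configured URL into the constructor. Again, the classes here are illustrative stand-ins rather than the real `GraphitiLLMConfig`/`OpenAIEmbedderConfig`:

```python
from dataclasses import dataclass

# Hypothetical stand-ins for the real config and embedder classes.
@dataclass
class OpenAIProviderConfig:
    api_url: str = "http://localhost:11434/v1"  # e.g. an OpenAI-compatible Ollama endpoint

@dataclass
class OpenAIEmbedder:
    base_url: str = "https://api.openai.com/v1"  # library default

def make_embedder(cfg: OpenAIProviderConfig) -> OpenAIEmbedder:
    # Fix: pass the configured URL through, so custom endpoints are honored.
    return OpenAIEmbedder(base_url=cfg.api_url)

embedder = make_embedder(OpenAIProviderConfig())
print(embedder.base_url)  # → http://localhost:11434/v1
```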

Test Plan

  • Tested with CLIProxyAPI (Anthropic-compatible proxy) - requests now correctly route to custom endpoint
  • Tested with Ollama (OpenAI-compatible embeddings) - embeddings now correctly use custom endpoint
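For context, a self-hosted setup exercising both paths might use a config shaped like the following. The field names are guesses inferred from the dotted paths above (`config.providers.anthropic.api_url`, `config.providers.openai.api_url`), and the ports are examples only:

```yaml
providers:
  anthropic:
    api_url: "http://localhost:8317"       # e.g. an Anthropic-compatible proxy
  openai:
    api_url: "http://localhost:11434/v1"   # e.g. Ollama's OpenAI-compatible API
```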

🤖 Generated with Claude Code

