Summary
The hardcoded PROVIDER_CAPABILITIES dict in app/services/ai/models/__init__.py is redundant now that we have the LLM catalog database tables.
Current State
Hardcoded dict with:
- `supports_streaming`
- `supports_function_calling`
- `supports_vision`
- `free_tier_available`
```python
PROVIDER_CAPABILITIES = {
    AIProvider.OPENAI: ProviderCapabilities(
        provider=AIProvider.OPENAI,
        supports_streaming=True,
        supports_function_calling=True,
        supports_vision=True,
        free_tier_available=False,
    ),
    # ... etc
}
```

Proposed Solution
- Add capability fields to the `LLMVendor` table (or create an `LLMVendorCapability` table)
- Populate capabilities via the LLM sync ETL from OpenRouter/LiteLLM metadata
- Change `get_provider_capabilities()` to query the database instead of the hardcoded dict (see the sketch after this list)
- Update the `ai providers` CLI command to use database-driven capabilities
- Remove the `PROVIDER_CAPABILITIES` dict and the `ProviderCapabilities` model
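
A minimal sketch of what the database-backed lookup could look like, assuming SQLAlchemy 2.0-style models; the `name` column, the `Session` argument, and the query shape are illustrative assumptions rather than the project's actual schema, though the capability columns mirror the existing flags:

```python
# Hypothetical sketch: capability columns on LLMVendor plus a database-backed
# replacement for get_provider_capabilities(). Table/column names and the
# session handling are assumptions, not the project's real schema.
from sqlalchemy import Boolean, String, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


class Base(DeclarativeBase):
    pass


class LLMVendor(Base):
    __tablename__ = "llm_vendor"

    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(100), unique=True)

    # New capability columns, kept current by the LLM sync ETL.
    supports_streaming: Mapped[bool] = mapped_column(Boolean, default=False)
    supports_function_calling: Mapped[bool] = mapped_column(Boolean, default=False)
    supports_vision: Mapped[bool] = mapped_column(Boolean, default=False)
    free_tier_available: Mapped[bool] = mapped_column(Boolean, default=False)


def get_provider_capabilities(session: Session, provider_name: str) -> LLMVendor | None:
    """Look up capabilities from the catalog instead of PROVIDER_CAPABILITIES."""
    return session.scalar(select(LLMVendor).where(LLMVendor.name == provider_name))
```

With rows like these in place, the `ai providers` CLI command can read the same table, so new providers appear without code changes once a sync has run.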
Benefits
- Single source of truth (database)
- Capabilities auto-update with LLM sync
- No need to manually maintain hardcoded values
- Extensible for new providers without code changes
Files to Modify
- `app/services/ai/models/__init__.py.j2/.jinja`
- `app/services/ai/models/llm/llm_vendor.py`
- `app/services/ai/etl/llm_sync_service.py` (see the ETL sketch below)
- `app/cli/ai.py.j2/.jinja`
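
For the sync service, a hedged sketch of how capability flags might be populated from provider metadata during the ETL run; the metadata key names are illustrative, since the actual OpenRouter/LiteLLM payload fields may be named differently:

```python
# Hypothetical ETL step: map metadata pulled during the LLM sync onto the new
# capability columns. Key names are assumptions; adjust to the real payload.
from sqlalchemy.orm import Session


def sync_vendor_capabilities(session: Session, vendor: "LLMVendor", metadata: dict) -> None:
    """Update one vendor row from sync metadata, defaulting to False when a key is absent."""
    vendor.supports_streaming = bool(metadata.get("supports_streaming", False))
    vendor.supports_function_calling = bool(metadata.get("supports_function_calling", False))
    vendor.supports_vision = bool(metadata.get("supports_vision", False))
    vendor.free_tier_available = bool(metadata.get("free_tier", False))
    session.add(vendor)
```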