[TECH DEBT] Support TLS configuration for Non-OpenAI LLM Providers #1122

@inFocus7

Description

πŸ“ Feature Summary

Add support for applying a ModelConfig's TLS configuration to non-OpenAI provider clients.

❓ Problem Statement / Motivation

Currently, a ModelConfig's TLS config is only applied to OpenAI provider clients. This support is missing for the other provider clients (e.g. Anthropic, Gemini, Ollama).

πŸ’‘ Proposed Solution

  • Investigate the other providers' clients to determine how CA certificates can be attached.
  • Add TLS configuration support for the missing providers' clients, where possible.
  • Add tests to validate the new support.
  • Ensure backwards compatibility (i.e. use the existing client configuration when no TLS config is set).

πŸ”„ Alternatives Considered

No response

🎯 Affected Service(s)

None

πŸ“š Additional Context

The config already exists on the Kubernetes/Go side (CRD + translation) and is set on the BaseLLM, so we just need to pass it down to the other providers and configure their clients with those settings.

The original implementation was in PR #1059, which includes extensive tests, so expanding on them is hopefully not too difficult.

πŸ™‹ Are you willing to contribute?

  • I am willing to submit a PR for this feature
