EPIC: dotAI: Support Additional Vendors #32634

@fmontes

Description

dotCMS currently supports only OpenAI's direct API, so customer data and content are sent to OpenAI's servers, which can be a blocker for privacy-sensitive customers. This epic delivers the ability for customers to choose among multiple AI providers (Azure OpenAI, AWS Bedrock, Google Vertex AI) through a provider-agnostic integration layer (LangChain4j).

This removes key enterprise adoption blockers for AI features in dotCMS and future-proofs the architecture for additional AI models and providers as the market evolves.
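The provider-agnostic layer can be pictured as a thin port-and-factory pattern. The sketch below is purely illustrative, not the dotCMS implementation: `ChatModelPort`, `ChatModelFactory`, and the provider stubs are hypothetical names, and in a real integration each factory branch would build the corresponding LangChain4j chat model for OpenAI, Azure OpenAI, Bedrock, or Vertex AI instead of returning a stub.

```java
import java.util.Map;
import java.util.function.Function;

// Hypothetical port: the single entry point all dotAI features would call.
interface ChatModelPort {
    String chat(String prompt);
}

// Hypothetical factory: maps a configured provider key to an implementation.
// Real code would return LangChain4j model adapters here instead of stubs.
final class ChatModelFactory {
    private static final Map<String, Function<String, ChatModelPort>> PROVIDERS = Map.of(
        "openai",  apiKey -> prompt -> "[openai stub] "  + prompt,
        "azure",   apiKey -> prompt -> "[azure stub] "   + prompt,
        "bedrock", apiKey -> prompt -> "[bedrock stub] " + prompt,
        "vertex",  apiKey -> prompt -> "[vertex stub] "  + prompt
    );

    static ChatModelPort create(String provider, String apiKey) {
        Function<String, ChatModelPort> builder = PROVIDERS.get(provider);
        if (builder == null) {
            throw new IllegalArgumentException("Unknown AI provider: " + provider);
        }
        return builder.apply(apiKey);
    }
}
```

Because every feature depends only on the port, adding a provider means adding one factory branch, with no changes to calling code.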

Important Links

Library we will be using: https://docs.langchain4j.dev/

Out of scope

No new AI features beyond provider flexibility: prompt enhancements, new AI content tools, and UX improvements will be separate epics. This epic covers a single configured provider for all dotAI features; multi-provider orchestration will be handled in another epic.

User Stories

  • As a developer at an enterprise customer, I want to use my company's preferred AI provider (Azure, AWS, or Google), so that my data stays within my cloud infrastructure.
  • As a system administrator, I want to configure and switch AI providers with no downtime, so that I can meet compliance and privacy requirements.
  • As a product manager, I want to maintain backward compatibility with OpenAI configurations, so that existing customers experience zero disruption.
  • As a backend engineer, I want a single, unified interface for calling AI services, so that adding new providers does not create maintenance overhead.
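The backward-compatibility story above implies a config-resolution rule: an explicitly configured provider wins, and in its absence a legacy OpenAI API key implies the OpenAI provider, so existing installs keep working untouched. A minimal sketch follows; the property names (`dotai.provider`, `dotai.openai.api.key`) are illustrative assumptions, not actual dotCMS configuration keys.

```java
import java.util.Properties;

// Hypothetical config resolution. An explicit provider setting wins;
// otherwise a legacy OpenAI API key implies the OpenAI provider, so
// existing customers experience zero disruption.
final class AiProviderConfig {
    static String resolveProvider(Properties props) {
        String explicit = props.getProperty("dotai.provider");
        if (explicit != null && !explicit.isBlank()) {
            return explicit;
        }
        if (props.getProperty("dotai.openai.api.key") != null) {
            return "openai"; // legacy configuration path
        }
        throw new IllegalStateException("No AI provider configured");
    }
}
```

Switching providers then becomes a config change rather than a code change, which is what makes no-downtime switching feasible.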

Key Execution Principles

  • Every goal ends with working, testable code — not a prototype that rots.
  • Always backward-compatible for existing OpenAI customers until they switch.
  • No big-bang release: each phase adds real customer value.
  • Spike work always feeds clear implementation outcomes.

Sub-issues

Metadata

Status: New
Milestone: no milestone
Relationships: none yet
Development: no branches or pull requests