
[Internal]: Elastic Managed LLM: Default AI Model for Elastic AI Assistant [Security Docs] #1754

Description

We are releasing the Elastic Managed LLM as the default large language model (LLM) available out of the box for users of the Elastic [Search, Observability, and Security] AI Assistant. This new capability lets customers use generative AI features immediately, without bringing their own model keys or managing external LLM connectors. The goal is to reduce adoption friction for enterprise teams by providing a secure, managed LLM experience.

Key points to document:

  • What is Elastic Managed LLM?
    A default, Elastic-hosted LLM pre-integrated across the Elastic platform, available for Security AI Assistant users without additional setup.

  • How to access and use:

    • The Elastic Managed LLM is automatically available as the default model in the AI Assistant for eligible users and deployments.
    • No manual connector setup or API key management is required for initial use.
    • Users can still opt to configure and use their own LLM connectors (OpenAI, Azure OpenAI, Amazon Bedrock, and so on) if desired; see the connector sketch after this list.
  • Benefits:

    • Instant access to generative AI features for security use cases.
    • Eliminates the need for users to procure/manage third-party API keys.
    • Reduces time-to-value for teams adopting AI-powered workflows.
  • Security and data privacy:

    • Data sent to the Elastic Managed LLM is encrypted in transit.
    • The model is configured for zero data retention—no prompts or outputs are stored.
    • Only request metadata (timestamp, model, region, status) is logged for operational purposes.
    • Hosted initially in AWS us-east-1, with plans for regional expansion.
  • Fallback and configuration:

    • Users can select a different LLM provider through the model dropdown if they prefer a custom or third-party LLM.
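For the bring-your-own-connector path referenced above, the sketch below shows one way a user could create a third-party LLM connector programmatically through Kibana's Connectors API (`POST /api/actions/connector`). It is a minimal illustration under stated assumptions: the `.gen-ai` connector type and the `config`/`secrets` field names reflect the current Kibana OpenAI connector, while the deployment URL, connector name, and API keys are placeholders. Field names should be verified against the published connector documentation before this appears in the docs.

```python
# Minimal sketch, assuming Kibana's Connectors API and the OpenAI connector type ".gen-ai".
# KIBANA_URL, API_KEY, and OPENAI_API_KEY are hypothetical placeholders.
import requests

KIBANA_URL = "https://my-deployment.kb.us-east-1.aws.found.io"  # placeholder deployment URL
API_KEY = "<kibana-api-key>"
OPENAI_API_KEY = "<openai-api-key>"

response = requests.post(
    f"{KIBANA_URL}/api/actions/connector",
    headers={
        "kbn-xsrf": "true",                      # required by Kibana for state-changing requests
        "Authorization": f"ApiKey {API_KEY}",
    },
    json={
        "name": "My OpenAI connector",
        "connector_type_id": ".gen-ai",          # OpenAI connector type in Kibana
        "config": {
            "apiProvider": "OpenAI",
            "apiUrl": "https://api.openai.com/v1/chat/completions",
        },
        "secrets": {"apiKey": OPENAI_API_KEY},
    },
)
response.raise_for_status()
# The returned connector ID corresponds to the entry users can pick from the
# AI Assistant model dropdown instead of the Elastic Managed LLM.
print(response.json()["id"])
```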

Resources

Which documentation set does this change impact?

Elastic On-Prem and Cloud (all)

Feature differences

  • Serverless: The Elastic Managed LLM is not visible or available for configuration in serverless project types that do not support it.
  • On-Prem/Cloud: Available as default; users can opt out or override with their own connector.

What release is this request related to?

N/A

Serverless release

N/A

Collaboration model

The documentation team

Point of contact

Main contact: @dhru42
Stakeholders: @peluja1012, @jamesspi, @isaclfreire


Labels

Team:Developer (Issues owned by the Developer Docs Team), Team:Experience (Issues owned by the Experience Docs Team)
