
[Security][8.18] Introduce Elastic Inference Service #278

Closed
@benironside

Description

Since our first AI feature, we've taken the approach of providing a different connector for each supported LLM, and we've always operated on a "bring your own LLM" model, meaning Elastic did not offer an LLM service of its own.

This is now changing with the introduction of the Elastic Inference Service, through which we will offer a default LLM. We need to make the changes necessary for our AI features to support the Inference Service via a new Kibana connector.
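For orientation, here is a minimal sketch (not part of this issue) of how an inference endpoint backed by the Elastic Inference Service might be created through the Elasticsearch `_inference` API before wiring it up to a Kibana connector. The endpoint id, task type, model id, and credentials below are illustrative placeholders; see the linked issues for the actual configuration.

```python
# Sketch: create an EIS-backed inference endpoint via the _inference API.
# All ids and credentials here are placeholders, not values from this issue.
import requests

ES_URL = "https://localhost:9200"      # assumption: local cluster
API_KEY = "<your-api-key>"             # assumption: API-key auth

task_type = "chat_completion"          # assumption; check the docs for supported EIS task types
endpoint_id = "my-eis-endpoint"        # illustrative endpoint id

body = {
    "service": "elastic",              # assumption: EIS-backed service name
    "service_settings": {
        "model_id": "<eis-model-id>"   # placeholder model id
    },
}

resp = requests.put(
    f"{ES_URL}/_inference/{task_type}/{endpoint_id}",
    json=body,
    headers={"Authorization": f"ApiKey {API_KEY}"},
    verify=False,                      # only for a local self-signed cert
)
print(resp.status_code, resp.json())
```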

More info here: https://github.com/elastic/security-team/issues/11369

Resources

More details here: https://github.com/elastic/security-team/issues/10938

Which documentation set does this change impact?

Cloud and Serverless

Feature differences

What release is this request related to?

8.18

Collaboration model

The documentation team

Point of contact

Main contact: @Charelzard

Stakeholders:

Metadata

Labels

Team:Security (Issues owned by the Security Docs Team), documentation (Improvements or additions to documentation), enhancement (New feature or request)
