Original file line number Diff line number Diff line change
@@ -63,7 +63,7 @@ The following video demonstrates these steps (click to watch).

### Configure an IAM User [_configure_an_iam_user]

Next, assign the policy you just created to a new user:
Next, assign the policy you created to a new user:

1. Return to the **IAM** menu. Select **Users** from the navigation menu, then click **Create User**.
2. Name the user, then click **Next**.
@@ -82,7 +82,7 @@ The following video demonstrates these steps (click to watch).
Create the access keys that will authenticate your Elastic connector:

1. Return to the **IAM** menu. Select **Users** from the navigation menu.
2. Search for the user you just created, and click its name.
2. Search for the user you created, and click its name.
3. Go to the **Security credentials** tab.
4. Under **Access keys**, click **Create access key**.
5. Select **Third-party service**, check the box under **Confirmation**, click **Next**, then click **Create access key**.
@@ -102,7 +102,7 @@ Finally, configure the connector in {{kib}}:
2. Find the **Connectors** page in the navigation menu or use the [global search field](/explore-analyze/find-and-organize/find-apps-and-objects.md). Then click **Create Connector**, and select **Amazon Bedrock**.
3. Name your connector.
4. (Optional) Configure the Amazon Bedrock connector to use a different AWS region where Anthropic models are supported by editing the **URL** field, for example by changing `us-east-1` to `eu-central-1`.
5. (Optional) Add one of the following strings if you want to use a model other than the default. Note that these model IDs should have a prefix of `us.` or `eu.`, depending on your region, for example `us.anthropic.claude-3-5-sonnet-20240620-v1:0` or `eu.anthropic.claude-3-5-sonnet-20240620-v1:0`.
5. (Optional) Add one of the following strings if you want to use a model other than the default. These model IDs should have a prefix of `us.` or `eu.`, depending on your region, for example `us.anthropic.claude-3-5-sonnet-20240620-v1:0` or `eu.anthropic.claude-3-5-sonnet-20240620-v1:0`.

* Sonnet 3.5: `us.anthropic.claude-3-5-sonnet-20240620-v1:0` or `eu.anthropic.claude-3-5-sonnet-20240620-v1:0`
* Sonnet 3.5 v2: `us.anthropic.claude-3-5-sonnet-20241022-v2:0` or `eu.anthropic.claude-3-5-sonnet-20241022-v2:0`
50 changes: 50 additions & 0 deletions explore-analyze/ai-features/llm-guides/llm-connectors.md
@@ -0,0 +1,50 @@
---
mapped_pages:
- https://www.elastic.co/guide/en/security/current/llm-connector-guides.html
- https://www.elastic.co/guide/en/serverless/current/security-llm-connector-guides.html
applies_to:
stack: all
serverless:
security: all
observability: all
elasticsearch: all
products:
- id: observability
- id: elasticsearch
- id: security
- id: cloud-serverless
---

# Enable large language model (LLM) access

Elastic uses large language model (LLM) connectors to power its [AI features](/explore-analyze/ai-features.md#ai-powered-features-in-elastic-sec). These features work with the out-of-the-box Elastic Managed LLM or with a third-party LLM connector that you configure.

## Elastic Managed LLM

:::{include} ../../../solutions/_snippets/elastic-managed-llm.md
:::

## Connect to a third-party LLM

Follow these guides to connect to one or more third-party LLM providers:

* [Azure OpenAI](/explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md)
* [Amazon Bedrock](/explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md)
* [OpenAI](/explore-analyze/ai-features/llm-guides/connect-to-openai.md)
* [Google Vertex](/explore-analyze/ai-features/llm-guides/connect-to-google-vertex.md)

## Preconfigured connectors

```{applies_to}
stack: ga
serverless: unavailable
```

You can also use [preconfigured connectors](kibana://reference/connectors-kibana/pre-configured-connectors.md) to set up third-party LLM connectors by editing the `kibana.yml` file. This lets you enable a connector for multiple spaces at once, without setting it up in the {{kib}} UI for each space.

If you use a preconfigured connector for your LLM connector, we recommend adding the `exposeConfig: true` parameter to the `xpack.actions.preconfigured` section of the `kibana.yml` config file. This parameter makes debugging easier by adding configuration information to the debug logs, including which LLM the connector uses.
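
For example, a preconfigured Amazon Bedrock connector with `exposeConfig` enabled might look like the following in `kibana.yml`. The connector ID, access keys, and model ID here are placeholders; check the preconfigured connectors reference for the exact fields your version supports.

```yaml
xpack.actions.preconfigured:
  my-bedrock-connector:
    name: Preconfigured Amazon Bedrock connector
    actionTypeId: .bedrock
    # Adds connector configuration details to the debug logs
    exposeConfig: true
    config:
      apiUrl: https://bedrock-runtime.us-east-1.amazonaws.com
      defaultModel: us.anthropic.claude-3-5-sonnet-20240620-v1:0
    secrets:
      accessKey: <your-access-key>
      secret: <your-secret-access-key>
```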





6 changes: 6 additions & 0 deletions explore-analyze/toc.yml
@@ -170,6 +170,12 @@ toc:
children:
- file: ai-features/ai-assistant.md
- file: ai-features/manage-access-to-ai-assistant.md
- file: ai-features/llm-guides/llm-connectors.md
children:
- file: ai-features/llm-guides/connect-to-azure-openai.md
- file: ai-features/llm-guides/connect-to-amazon-bedrock.md
- file: ai-features/llm-guides/connect-to-openai.md
- file: ai-features/llm-guides/connect-to-google-vertex.md
- file: discover.md
children:
- file: discover/discover-get-started.md
7 changes: 7 additions & 0 deletions redirects.yml
@@ -600,3 +600,10 @@ redirects:

# Related to https://github.com/elastic/docs-content/pull/3808
'solutions/observability/get-started/other-tutorials/add-data-from-splunk.md': 'solutions/observability/get-started.md'


# Related to https://github.com/elastic/docs-content/pull/4224
'solutions/security/ai/connect-to-amazon-bedrock.md': 'explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md'
'solutions/security/ai/connect-to-azure-openai.md': 'explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md'
'solutions/security/ai/connect-to-google-vertex.md': 'explore-analyze/ai-features/llm-guides/connect-to-google-vertex.md'
'solutions/security/ai/connect-to-openai.md': 'explore-analyze/ai-features/llm-guides/connect-to-openai.md'
8 changes: 4 additions & 4 deletions solutions/_snippets/elastic-managed-llm.md
@@ -1,9 +1,9 @@
[Elastic Managed LLM](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) is the default large language model (LLM) connector available in the AI Assistant for eligible users. It provides immediate access to generative AI features without requiring any setup or external model integration.
[Elastic Managed LLM](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) is the default large language model (LLM) connector available in {{kib}} for eligible users. It provides immediate access to generative AI features without requiring any setup or external model integration.

The Elastic Managed LLM is available out-of-the box; no manual connector setup or API key management is required for initial use. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock if you prefer.
Elastic Managed LLM is available out of the box; it does not require manual configuration or API key management. Alternatively, you can configure and use a third-party LLM connector, such as OpenAI, Azure, or Amazon Bedrock.

To learn more about security and data privacy, refer to the [connector documentation](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) and [download the model card](https://raw.githubusercontent.com/elastic/kibana/refs/heads/main/docs/reference/resources/Elastic_Managed_LLM_model_card.pdf).
To learn more about security and data privacy, refer to [Elastic Managed LLM](https://www.elastic.co/docs/reference/kibana/connectors-kibana/elastic-managed-llm) and [download the model card](https://raw.githubusercontent.com/elastic/kibana/refs/heads/main/docs/reference/resources/Elastic_Managed_LLM_model_card.pdf).

:::{important}
Using the Elastic Managed LLM incurs additional costs. Refer to [{{ecloud}} pricing](https://www.elastic.co/pricing/serverless-search) for more information.
Using Elastic Managed LLM incurs additional costs. Refer to [{{ecloud}} pricing](https://www.elastic.co/pricing/serverless-search) for more information.
:::
@@ -1,7 +1,4 @@
---
mapped_pages:
- https://www.elastic.co/guide/en/security/current/llm-connector-guides.html
- https://www.elastic.co/guide/en/serverless/current/security-llm-connector-guides.html
applies_to:
stack: all
serverless:
@@ -13,7 +10,7 @@ products:

# Enable large language model (LLM) access

{{elastic-sec}} uses large language models (LLMs) for some of its advanced analytics features. To enable these features, you can connect a third-party LLM provider or a custom local LLM.
{{elastic-sec}} uses large language model (LLM) connectors to power its [AI features](/explore-analyze/ai-features.md#ai-powered-features-in-elastic-sec). To use these features, you can use the Elastic Managed LLM, configure a third-party LLM connector, or connect a custom local LLM.

:::{important}
Different LLMs have varying performance when used to power different features and use-cases. For more information about how various models perform on different tasks in {{elastic-sec}}, refer to the [Large language model performance matrix](/solutions/security/ai/large-language-model-performance-matrix.md).
@@ -28,10 +25,10 @@ Different LLMs have varying performance when used to power different features and

Follow these guides to connect to one or more third-party LLM providers:

* [Azure OpenAI](/solutions/security/ai/connect-to-azure-openai.md)
* [Amazon Bedrock](/solutions/security/ai/connect-to-amazon-bedrock.md)
* [OpenAI](/solutions/security/ai/connect-to-openai.md)
* [Google Vertex](/solutions/security/ai/connect-to-google-vertex.md)
* [Azure OpenAI](/explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md)
* [Amazon Bedrock](/explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md)
* [OpenAI](/explore-analyze/ai-features/llm-guides/connect-to-openai.md)
* [Google Vertex](/explore-analyze/ai-features/llm-guides/connect-to-google-vertex.md)

## Connect to a self-managed LLM

4 changes: 0 additions & 4 deletions solutions/toc.yml
@@ -1,3 +1,3 @@
project: "Solutions and use cases"
toc:
- file: index.md
@@ -568,10 +568,6 @@
- file: security/ai/set-up-connectors-for-large-language-models-llm.md
children:
- file: security/ai/large-language-model-performance-matrix.md
- file: security/ai/connect-to-azure-openai.md
- file: security/ai/connect-to-amazon-bedrock.md
- file: security/ai/connect-to-openai.md
- file: security/ai/connect-to-google-vertex.md
- file: security/ai/connect-to-own-local-llm.md
- file: security/ai/connect-to-vLLM.md
- file: security/ai/use-cases.md