[DOCS-7981] Add Datadog Agent source to log volume control #23226

Merged
merged 2 commits on May 23, 2024
add dd agent source
maycmlee committed May 16, 2024
commit 0c4b1f46359bbba2634425578adc611894cf1196
11 changes: 8 additions & 3 deletions config/_default/menus/main.en.yaml
@@ -3910,21 +3910,26 @@ menu:
       parent: observability_pipelines
       identifier: observability_pipelines_log_volume_control
       weight: 2
+    - name: Datadog Agent
+      url: observability_pipelines/log_volume_control/datadog_agent/
+      parent: observability_pipelines_log_volume_control
+      identifier: observability_pipelines_log_volume_control_datadog_agent
+      weight: 2001
     - name: Splunk HTTP Event Collector
       url: observability_pipelines/log_volume_control/splunk_hec/
       parent: observability_pipelines_log_volume_control
       identifier: observability_pipelines_log_volume_control_splunk_hec
-      weight: 2001
+      weight: 2002
     - name: Splunk Forwarders (TCP)
       url: observability_pipelines/log_volume_control/splunk_tcp/
       parent: observability_pipelines_log_volume_control
       identifier: observability_pipelines_log_volume_control_splunk_tcp
-      weight: 2002
+      weight: 2003
     - name: Sumo Logic Hosted Collector
       url: observability_pipelines/log_volume_control/sumo_logic_hosted_collector/
       parent: observability_pipelines_log_volume_control
       identifier: observability_pipelines_log_volume_control_sumo_logic_hosted_collector
-      weight: 2003
+      weight: 2004
     - name: Dual Ship Logs
       url: observability_pipelines/dual_ship_logs/
       parent: observability_pipelines
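For context on the renumbering above: Hugo orders sibling menu entries by ascending `weight`, so inserting Datadog Agent at `2001` requires bumping every later sibling by one. A minimal, hypothetical sketch of the mechanism (entry names only, not the real file):

```yaml
# Hypothetical Hugo menu fragment — siblings render in ascending weight order.
menu:
  main:
    - name: Datadog Agent
      weight: 2001   # renders first
    - name: Splunk HTTP Event Collector
      weight: 2002   # renders second
```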
@@ -18,10 +18,12 @@ As your infrastructure and applications grow, so does your log volume and the co
 
 Select a log source to get started:
 
+- [Datadog Agent][4]
 - [Splunk HTTP Event Collector (HEC)][1]
 - [Splunk Heavy and Universal Forwarders (TCP)][2]
 - [Sumo Logic Hosted Collector][3]
 
 [1]: /observability_pipelines/log_volume_control/splunk_hec
 [2]: /observability_pipelines/log_volume_control/splunk_tcp
-[3]: /observability_pipelines/log_volume_control/sumo_logic_hosted_collector
+[3]: /observability_pipelines/log_volume_control/sumo_logic_hosted_collector
+[4]: /observability_pipelines/log_volume_control/datadog_agent
@@ -0,0 +1,165 @@
---
title: Log Volume Control for the Datadog Agent
kind: document
disable_toc: false
---

## Overview

Set up the Observability Pipelines Worker with the Datadog Agent source so that you route only useful logs to your destinations.

{{< img src="observability_pipelines/use_cases/log_volume_control.png" alt="The log sources, processors, and destinations available for this use case" width="100%" >}}

This document walks you through the following steps:
1. The [prerequisites](#prerequisites) needed to set up Observability Pipelines
1. [Setting up Observability Pipelines](#set-up-observability-pipelines)
1. [Connecting the Datadog Agent to the Observability Pipelines Worker](#connect-the-datadog-agent-to-the-observability-pipelines-worker)

## Prerequisites

{{% observability_pipelines/prerequisites/datadog_agent %}}

{{< tabs >}}
{{% tab "Splunk HEC" %}}

{{% observability_pipelines/prerequisites/splunk_hec %}}

{{% /tab %}}
{{% tab "Sumo Logic" %}}

{{% observability_pipelines/prerequisites/sumo_logic %}}

{{% /tab %}}
{{< /tabs >}}

## Set up Observability Pipelines

1. Navigate to [Observability Pipelines][1].
1. Select the **Log Volume Control** template to create a new pipeline.
1. Select **Datadog Agent** as the source.

### Set up the source

{{% observability_pipelines/source_settings/datadog_agent %}}

### Set up the destinations

Enter the following information based on the log destination you selected.

{{< tabs >}}
{{% tab "Datadog" %}}

{{% observability_pipelines/destination_settings/datadog %}}

{{% /tab %}}
{{% tab "Splunk HEC" %}}

{{% observability_pipelines/destination_settings/splunk_hec %}}

{{% /tab %}}
{{% tab "Sumo Logic" %}}

{{% observability_pipelines/destination_settings/sumo_logic %}}

{{% /tab %}}
{{< /tabs >}}

### Set up processors

{{% observability_pipelines/processors/intro %}}

{{% observability_pipelines/processors/filter_syntax %}}

{{% observability_pipelines/processors/add_processors %}}

{{< tabs >}}
{{% tab "Filter" %}}

{{% observability_pipelines/processors/filter %}}

{{% /tab %}}
{{% tab "Sample" %}}

{{% observability_pipelines/processors/sample %}}

{{% /tab %}}
{{% tab "Quota" %}}

{{% observability_pipelines/processors/quota %}}

{{% /tab %}}
{{% tab "Dedupe" %}}

{{% observability_pipelines/processors/dedupe %}}

{{% /tab %}}
{{% tab "Edit fields" %}}

{{% observability_pipelines/processors/remap %}}

{{% /tab %}}
{{< /tabs >}}

### Install the Observability Pipelines Worker
1. Select your platform in the **Choose your installation platform** dropdown menu.
1. Enter the Datadog Agent address. This is the address and port where your Datadog Agent sends its logs. The Observability Pipelines Worker listens on this address for incoming logs.
1. Provide the environment variables for each of your selected destinations.
{{< tabs >}}
{{% tab "Datadog" %}}

{{% observability_pipelines/destination_env_vars/datadog %}}

{{% /tab %}}
{{% tab "Splunk HEC" %}}

{{% observability_pipelines/destination_env_vars/splunk_hec %}}

{{% /tab %}}
{{% tab "Sumo Logic" %}}

{{% observability_pipelines/destination_env_vars/sumo_logic %}}

{{% /tab %}}
{{< /tabs >}}
1. Follow the instructions for your environment to install the Worker.
{{< tabs >}}
{{% tab "Docker" %}}

{{% observability_pipelines/install_worker/docker %}}

{{% /tab %}}
{{% tab "Amazon EKS" %}}

{{% observability_pipelines/install_worker/amazon_eks %}}

{{% /tab %}}
{{% tab "Azure AKS" %}}

{{% observability_pipelines/install_worker/azure_aks %}}

{{% /tab %}}
{{% tab "Google GKE" %}}

{{% observability_pipelines/install_worker/google_gke %}}

{{% /tab %}}
{{% tab "Linux (APT)" %}}

{{% observability_pipelines/install_worker/linux_apt %}}

{{% /tab %}}
{{% tab "Linux (RPM)" %}}

{{% observability_pipelines/install_worker/linux_rpm %}}

{{% /tab %}}
{{% tab "CloudFormation" %}}

{{% observability_pipelines/install_worker/cloudformation %}}

{{% /tab %}}
{{< /tabs >}}
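The in-app setup generates the exact install command for your pipeline, so copy it from there. As a rough illustration only, a Docker install typically resembles the following; the angle-bracket values are placeholders, and your pipeline's instructions may include additional destination-specific environment variables:

```shell
# Illustrative sketch — use the exact command shown in the Observability Pipelines UI.
docker run -i \
  -e DD_API_KEY=<DATADOG_API_KEY> \
  -e DD_OP_PIPELINE_ID=<PIPELINE_ID> \
  -e DD_SITE=<DATADOG_SITE> \
  -p 8282:8282 \
  datadog/observability-pipelines-worker run
```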

{{% observability_pipelines/log_source_configuration/datadog_agent %}}
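The included instructions above cover the details; conceptually, connecting the Agent comes down to pointing its log forwarding at the Worker in `datadog.yaml`. A sketch, where the host and port are placeholders that must match the listener address configured during the Worker install:

```yaml
# datadog.yaml sketch — <OPW_HOST> and port are placeholders, not real values.
observability_pipelines_worker:
  logs:
    enabled: true
    url: "http://<OPW_HOST>:8282"
```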

[1]: https://app.datadoghq.com/observability-pipelines
@@ -13,7 +13,7 @@ This document walks you through the following steps:
 1. [Setting up Observability Pipelines](#set-up-observability-pipelines)
 1. [Sending logs to the Worker over Splunk HEC](#send-logs-to-the-observability-pipelines-worker-over-splunk-hec)
 
-{{< img src="observability_pipelines/use_cases/log_volume_control.png" alt="The log sources, processors, and destinations available for the split logs use case" width="100%" >}}
+{{< img src="observability_pipelines/use_cases/log_volume_control.png" alt="The log sources, processors, and destinations available for this use case" width="100%" >}}
 
 ## Prerequisites
 
@@ -12,7 +12,7 @@ This document walks you through the following steps to set up the Observability
 1. [Setting up Observability Pipelines](#set-up-observability-pipelines)
 1. [Connecting Splunk Forwarder to the Observability Pipelines Worker](#connect-splunk-forwarder-to-the-observability-pipelines-worker)
 
-{{< img src="observability_pipelines/use_cases/log_volume_control.png" alt="The log sources, processors, and destinations available for the split logs use case" width="100%" >}}
+{{< img src="observability_pipelines/use_cases/log_volume_control.png" alt="The log sources, processors, and destinations available for this use case" width="100%" >}}
 
 ## Prerequisites
 
@@ -12,7 +12,7 @@ This document walks you through the following steps to set up the Observability
 1. [Setting up Observability Pipelines](#set-up-observability-pipelines)
 1. [Sending logs to the Observability Pipelines Worker over Sumo Logic HTTP Source](#send-logs-to-the-observability-pipelines-worker-over-sumo-logic-http-source)
 
-{{< img src="observability_pipelines/use_cases/log_volume_control.png" alt="The log sources, processors, and destinations available for the split logs use case" width="100%" >}}
+{{< img src="observability_pipelines/use_cases/log_volume_control.png" alt="The log sources, processors, and destinations available for this use case" width="100%" >}}
 
 ## Prerequisites
 