Use container parser of stanza for logs collection #1195

Merged (1 commit) on Jun 25, 2024

Conversation

@ChrsMark (Member) commented May 21, 2024

Fixes #1188.

This is a draft until v0.102.0 of the Collector is released, which will include the parser's new `max_log_size` setting.

Testing notes

  1. helm install daemonset ./opentelemetry-collector --set presets.logsCollection.enabled=true --set image.repository="otel/opentelemetry-collector-k8s" --set command.name="otelcol-k8s" --set mode=daemonset
  2. k get cm daemonset-opentelemetry-collector-agent -o jsonpath='{.data.relay}':
exporters:
  debug: {}
extensions:
  health_check:
    endpoint: ${env:MY_POD_IP}:13133
processors:
  batch: {}
  memory_limiter:
    check_interval: 5s
    limit_percentage: 80
    spike_limit_percentage: 25
receivers:
  filelog:
    exclude:
    - /var/log/pods/default_daemonset-opentelemetry-collector*_*/opentelemetry-collector/*.log
    include:
    - /var/log/pods/*/*/*.log
    include_file_name: false
    include_file_path: true
    operators:
    - id: container-parser
      max_log_size: 102400
      type: container
    retry_on_failure:
      enabled: true
    start_at: end
  jaeger:
    protocols:
      grpc:
        endpoint: ${env:MY_POD_IP}:14250
      thrift_compact:
        endpoint: ${env:MY_POD_IP}:6831
      thrift_http:
        endpoint: ${env:MY_POD_IP}:14268
  otlp:
    protocols:
      grpc:
        endpoint: ${env:MY_POD_IP}:4317
      http:
        endpoint: ${env:MY_POD_IP}:4318
  prometheus:
    config:
      scrape_configs:
      - job_name: opentelemetry-collector
        scrape_interval: 10s
        static_configs:
        - targets:
          - ${env:MY_POD_IP}:8888
  zipkin:
    endpoint: ${env:MY_POD_IP}:9411
service:
  extensions:
  - health_check
  pipelines:
    logs:
      exporters:
      - debug
      processors:
      - memory_limiter
      - batch
      receivers:
      - otlp
      - filelog
    metrics:
      exporters:
      - debug
      processors:
      - memory_limiter
      - batch
      receivers:
      - otlp
      - prometheus
    traces:
      exporters:
      - debug
      processors:
      - memory_limiter
      - batch
      receivers:
      - otlp
      - jaeger
      - zipkin
  telemetry:
    metrics:
      address: ${env:MY_POD_IP}:8888
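The log records in this thread show the fields the `container` operator extracts from containerd-formatted lines (`time`, `log.iostream`, `logtag`, and the message body). As an illustrative sketch only, not the collector's actual implementation, the containerd/CRI line format `<RFC3339 time> <stream> <logtag> <message>` can be split like this:

```python
import re

# Sketch of the containerd/CRI-O log line format that the `container`
# operator understands. The real operator also handles Docker JSON logs
# and recombines partial lines (logtag "P") up to max_log_size bytes.
CRI_LINE = re.compile(
    r"^(?P<time>\S+) (?P<stream>stdout|stderr) (?P<logtag>[FP]) ?(?P<log>.*)$"
)

def parse_cri_line(line: str) -> dict:
    """Split one CRI log line into the fields seen as log attributes."""
    match = CRI_LINE.match(line)
    if match is None:
        raise ValueError(f"not a CRI-formatted log line: {line!r}")
    return match.groupdict()

# Sample line reconstructed from the attributes in the records below.
sample = "2024-05-22T09:35:07.383200464Z stdout F otel logs at 09:35:07"
fields = parse_cri_line(sample)
# fields == {"time": "2024-05-22T09:35:07.383200464Z", "stream": "stdout",
#            "logtag": "F", "log": "otel logs at 09:35:07"}
```

A logtag of `F` marks a full line; `P` marks a partial line that the parser recombines with its successors, which is why `max_log_size` bounds the recombined record.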

@ChrsMark ChrsMark marked this pull request as ready for review May 22, 2024 09:37
@ChrsMark ChrsMark requested a review from a team May 22, 2024 09:37
@ChrsMark (Member, Author):

Also tested end-to-end on kind v0.20.0 (go1.20.4, linux/amd64), Kubernetes v1.27.3, containerd runtime.
Sample records:

2024-05-22T09:35:07.700Z	info	LogsExporter	{"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 2}
2024-05-22T09:35:07.700Z	info	ResourceLog #0
Resource SchemaURL: 
ScopeLogs #0
ScopeLogs SchemaURL: 
InstrumentationScope  
LogRecord #0
ObservedTimestamp: 2024-05-22 09:35:07.572874147 +0000 UTC
Timestamp: 2024-05-22 09:35:07.383200464 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str(otel logs at 09:35:07)
Attributes:
     -> log.file.path: Str(/var/log/pods/default_daemonset-logs-kk9m8_d5f6af9d-6ff6-4ba4-a8bb-dab9fea6838c/busybox/0.log)
     -> k8s.container.name: Str(busybox)
     -> k8s.namespace.name: Str(default)
     -> k8s.pod.name: Str(daemonset-logs-kk9m8)
     -> k8s.container.restart_count: Str(0)
     -> log.iostream: Str(stdout)
     -> logtag: Str(F)
     -> k8s.pod.uid: Str(d5f6af9d-6ff6-4ba4-a8bb-dab9fea6838c)
     -> time: Str(2024-05-22T09:35:07.383200464Z)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #1
ObservedTimestamp: 2024-05-22 09:35:07.572974215 +0000 UTC
Timestamp: 2024-05-22 09:35:07.485134813 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str(otel logs at 09:35:07)
Attributes:
     -> k8s.pod.name: Str(daemonset-logs-kk9m8)
     -> k8s.pod.uid: Str(d5f6af9d-6ff6-4ba4-a8bb-dab9fea6838c)
     -> log.file.path: Str(/var/log/pods/default_daemonset-logs-kk9m8_d5f6af9d-6ff6-4ba4-a8bb-dab9fea6838c/busybox/0.log)
     -> time: Str(2024-05-22T09:35:07.485134813Z)
     -> log.iostream: Str(stdout)
     -> logtag: Str(F)
     -> k8s.namespace.name: Str(default)
     -> k8s.container.restart_count: Str(0)
     -> k8s.container.name: Str(busybox)
Trace ID: 
Span ID: 
Flags: 0
	{"kind": "exporter", "data_type": "logs", "name": "debug"}
2024-05-22T09:35:07.901Z	info	ResourceLog #0
Resource SchemaURL: 
ScopeLogs #0
ScopeLogs SchemaURL: 
InstrumentationScope  
LogRecord #0
ObservedTimestamp: 2024-05-22 09:35:07.772023047 +0000 UTC
Timestamp: 2024-05-22 09:35:07.587542234 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str(otel logs at 09:35:07)
Attributes:
     -> k8s.pod.name: Str(daemonset-logs-kk9m8)
     -> k8s.container.restart_count: Str(0)
     -> k8s.pod.uid: Str(d5f6af9d-6ff6-4ba4-a8bb-dab9fea6838c)
     -> k8s.container.name: Str(busybox)
     -> log.file.path: Str(/var/log/pods/default_daemonset-logs-kk9m8_d5f6af9d-6ff6-4ba4-a8bb-dab9fea6838c/busybox/0.log)
     -> log.iostream: Str(stdout)
     -> time: Str(2024-05-22T09:35:07.587542234Z)
     -> k8s.namespace.name: Str(default)
     -> logtag: Str(F)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #1
ObservedTimestamp: 2024-05-22 09:35:07.772149911 +0000 UTC
Timestamp: 2024-05-22 09:35:07.689592419 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str(otel logs at 09:35:07)
Attributes:
     -> k8s.pod.uid: Str(d5f6af9d-6ff6-4ba4-a8bb-dab9fea6838c)
     -> k8s.container.name: Str(busybox)
     -> k8s.namespace.name: Str(default)
     -> k8s.pod.name: Str(daemonset-logs-kk9m8)
     -> time: Str(2024-05-22T09:35:07.689592419Z)
     -> log.iostream: Str(stdout)
     -> logtag: Str(F)
     -> log.file.path: Str(/var/log/pods/default_daemonset-logs-kk9m8_d5f6af9d-6ff6-4ba4-a8bb-dab9fea6838c/busybox/0.log)
     -> k8s.container.restart_count: Str(0)
Trace ID: 
Span ID: 
Flags: 0

djaglowski pushed a commit to open-telemetry/opentelemetry-collector-contrib that referenced this pull request May 22, 2024

**Description:** The
[logsCollection](https://github.com/open-telemetry/opentelemetry-helm-charts/blob/ef0e1ac4f645cdbb9bd0108c76b1ed69e418430c/charts/opentelemetry-collector/values.yaml#L29C3-L37)
Helm preset provides the option to set `maxRecombineLogSize`.
The `container` parser does not expose this option but rather sets it to
`102400` internally by default.
This PR makes the option configurable so that the parser can be used
[seamlessly in the Helm preset](open-telemetry/opentelemetry-helm-charts#1195).

**Testing:** Tests added.

**Documentation:** Updated.

Signed-off-by: ChrsMark <chrismarkou92@gmail.com>
@ChrsMark ChrsMark changed the title Use container parser of stanza for logs collection [wip] Use container parser of stanza for logs collection May 22, 2024
@ChrsMark (Member, Author):

This should not get merged until the v0.102.0 version of Collector is released, which will include the new max_log_size setting of the parser.


github-actions bot commented Jun 6, 2024

This PR was marked stale due to lack of activity. It will be closed in 14 days.

@github-actions github-actions bot added the Stale label Jun 6, 2024
@ChrsMark (Member, Author) commented Jun 6, 2024

/label -Stale

@TylerHelmuth (Member):

@ChrsMark v0.102.1 is out; this is ready to be picked up again.

@ChrsMark (Member, Author) commented Jun 6, 2024

There was an additional fix for the parser that did not make it to v0.102.0: open-telemetry/opentelemetry-collector-contrib#33353

I would suggest we wait for it before doing the switch.

@github-actions github-actions bot removed the Stale label Jun 7, 2024
github-actions bot:

This PR was marked stale due to lack of activity. It will be closed in 14 days.

@github-actions github-actions bot added the Stale label Jun 22, 2024
@ChrsMark (Member, Author):
Not stale.

@ChrsMark (Member, Author):

@TylerHelmuth the fixes are finally in. This one should be ready for review now.

Test the produced configmap

  1. Run helm install daemonset ./charts/opentelemetry-collector --set presets.logsCollection.enabled=true --set image.repository="otel/opentelemetry-collector-k8s" --set command.name="otelcol-k8s" --set mode=daemonset --dry-run

  2. Verify the produced config:

    receivers:
      filelog:
        exclude:
        - /var/log/pods/default_daemonset-opentelemetry-collector*_*/opentelemetry-collector/*.log
        include:
        - /var/log/pods/*/*/*.log
        include_file_name: false
        include_file_path: true
        operators:
        - id: container-parser
          max_log_size: 102400
          type: container
        retry_on_failure:
          enabled: true
        start_at: end
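The preset wires its `maxRecombineLogSize` option through to the operator's `max_log_size`. A hedged sketch of the values override, assuming the key name from the chart's `values.yaml` (102400 bytes is the parser's internal default):

```yaml
presets:
  logsCollection:
    enabled: true
    # Assumption: maxRecombineLogSize maps to the container operator's
    # max_log_size; 102400 matches the parser's default shown above.
    maxRecombineLogSize: 102400
```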

Verify the produced log records

  1. helm install daemonset ./charts/opentelemetry-collector --set presets.logsCollection.enabled=true --set image.repository="otel/opentelemetry-collector-k8s" --set command.name="otelcol-k8s" --set mode=daemonset --set config.exporters.debug.verbosity="detailed"

  2. Verify that resource attributes and log attributes are populated properly:

2024-06-25T08:15:12.981Z	info	ResourceLog #0
Resource SchemaURL: 
Resource attributes:
     -> k8s.namespace.name: Str(default)
     -> k8s.pod.name: Str(daemonset-logs-8w7rm)
     -> k8s.container.restart_count: Str(0)
     -> k8s.pod.uid: Str(539b21bf-0b23-4929-ad61-a5c76820b309)
     -> k8s.container.name: Str(busybox)
ScopeLogs #0
ScopeLogs SchemaURL: 
InstrumentationScope  
LogRecord #0
ObservedTimestamp: 2024-06-25 08:15:12.739110323 +0000 UTC
Timestamp: 2024-06-25 08:15:12.606659689 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str(otel logs at 08:15:12)
Attributes:
     -> logtag: Str(F)
     -> log.file.path: Str(/var/log/pods/default_daemonset-logs-8w7rm_539b21bf-0b23-4929-ad61-a5c76820b309/busybox/0.log)
     -> time: Str(2024-06-25T08:15:12.606659689Z)
     -> log.iostream: Str(stdout)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #1
ObservedTimestamp: 2024-06-25 08:15:12.739202433 +0000 UTC
Timestamp: 2024-06-25 08:15:12.707975372 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str(otel logs at 08:15:12)
Attributes:
     -> log.file.path: Str(/var/log/pods/default_daemonset-logs-8w7rm_539b21bf-0b23-4929-ad61-a5c76820b309/busybox/0.log)
     -> time: Str(2024-06-25T08:15:12.707975372Z)
     -> log.iostream: Str(stdout)
     -> logtag: Str(F)
Trace ID: 
Span ID: 
Flags: 0
	{"kind": "exporter", "data_type": "logs", "name": "debug"}

@ChrsMark ChrsMark changed the title [wip] Use container parser of stanza for logs collection Use container parser of stanza for logs collection Jun 25, 2024
Comment on lines +208 to +209
- type: container
id: container-parser
A project member commented:
@ChrsMark @djaglowski thanks for making this parser, its amazing.

12ushan pushed a commit to giffgaff/opentelemetry-helm-charts that referenced this pull request Jul 22, 2024
Successfully merging this pull request may close these issues.

[collector] Use container parser in the filelogreceiver