elasticsearchexporter failed to connect to elastic cloud #29689

Closed
smallc2009 opened this issue Dec 7, 2023 · 16 comments

@smallc2009

Component(s)

exporter/elasticsearch

What happened?

Description

My environment is hosted on an EKS 1.26.0 cluster, and Elasticsearch 8.10 runs on Elastic Cloud. I'm using the elasticsearch exporter to send logs to Elastic Cloud. I'm getting errors showing that it keeps dialing 10.46.48.34:18422 and timing out, and I don't know where this IP comes from.

Below is the error message:

github.com/elastic/go-elasticsearch/v7/esutil.(*worker).flush
github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:553
github.com/elastic/go-elasticsearch/v7/esutil.(*bulkIndexer).init.func1
github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:321
2023-12-07T12:41:55.688Z error elasticsearchexporter@v0.90.1/elasticsearch_bulk.go:49 Request failed. {"kind": "exporter", "data_type": "logs", "name": "elasticsearch/log", "reason": "dial tcp 10.46.48.82:18722: i/o timeout"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.(*clientLogger).LogRoundTrip
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.90.1/elasticsearch_bulk.go:49
github.com/elastic/go-elasticsearch/v7/estransport.(*Client).logRoundTrip
github.com/elastic/go-elasticsearch/v7@v7.17.10/estransport/estransport.go:576
github.com/elastic/go-elasticsearch/v7/estransport.(*Client).Perform
github.com/elastic/go-elasticsearch/v7@v7.17.10/estransport/estransport.go:372
github.com/elastic/go-elasticsearch/v7/esapi.InfoRequest.Do
github.com/elastic/go-elasticsearch/v7@v7.17.10/esapi/api.info.go:117
github.com/elastic/go-elasticsearch/v7.(*Client).productCheck
github.com/elastic/go-elasticsearch/v7@v7.17.10/elasticsearch.go:350
github.com/elastic/go-elasticsearch/v7.(*Client).doProductCheck
github.com/elastic/go-elasticsearch/v7@v7.17.10/elasticsearch.go:337
github.com/elastic/go-elasticsearch/v7.(*Client).Perform
github.com/elastic/go-elasticsearch/v7@v7.17.10/elasticsearch.go:298
github.com/elastic/go-elasticsearch/v7/esapi.BulkRequest.Do
github.com/elastic/go-elasticsearch/v7@v7.17.10/esapi/api.bulk.go:188
github.com/elastic/go-elasticsearch/v7/esutil.(*worker).flush
github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:553
github.com/elastic/go-elasticsearch/v7/esutil.(*bulkIndexer).init.func1
github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:321
2023-12-07T12:42:25.868Z error elasticsearchexporter@v0.90.1/elasticsearch_bulk.go:49 Request failed. {"kind": "exporter", "data_type": "logs", "name": "elasticsearch/log", "reason": "dial tcp 10.46.48.34:18422: i/o timeout"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.(*clientLogger).LogRoundTrip
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.90.1/elasticsearch_bulk.go:49
github.com/elastic/go-elasticsearch/v7/estransport.(*Client).logRoundTrip
github.com/elastic/go-elasticsearch/v7@v7.17.10/estransport/estransport.go:576
github.com/elastic/go-elasticsearch/v7/estransport.(*Client).Perform
github.com/elastic/go-elasticsearch/v7@v7.17.10/estransport/estransport.go:372
github.com/elastic/go-elasticsearch/v7/esapi.InfoRequest.Do
github.com/elastic/go-elasticsearch/v7@v7.17.10/esapi/api.info.go:117
github.com/elastic/go-elasticsearch/v7.(*Client).productCheck
github.com/elastic/go-elasticsearch/v7@v7.17.10/elasticsearch.go:350
github.com/elastic/go-elasticsearch/v7.(*Client).doProductCheck
github.com/elastic/go-elasticsearch/v7@v7.17.10/elasticsearch.go:337
github.com/elastic/go-elasticsearch/v7.(*Client).Perform
github.com/elastic/go-elasticsearch/v7@v7.17.10/elasticsearch.go:298
github.com/elastic/go-elasticsearch/v7/esapi.BulkRequest.Do
github.com/elastic/go-elasticsearch/v7@v7.17.10/esapi/api.bulk.go:188
github.com/elastic/go-elasticsearch/v7/esutil.(*worker).flush
github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:553
github.com/elastic/go-elasticsearch/v7/esutil.(*bulkIndexer).init.func1
github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:321
2023-12-07T12:42:56.259Z error elasticsearchexporter@v0.90.1/elasticsearch_bulk.go:49 Request failed. {"kind": "exporter", "data_type": "logs", "name": "elasticsearch/log", "reason": "dial tcp 10.46.48.237:18726: i/o timeout"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.(*clientLogger).LogRoundTrip
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.90.1/elasticsearch_bulk.go:49
github.com/elastic/go-elasticsearch/v7/estransport.(*Client).logRoundTrip
github.com/elastic/go-elasticsearch/v7@v7.17.10/estransport/estransport.go:576
github.com/elastic/go-elasticsearch/v7/estransport.(*Client).Perform
github.com/elastic/go-elasticsearch/v7@v7.17.10/estransport/estransport.go:372
github.com/elastic/go-elasticsearch/v7/esapi.InfoRequest.Do
github.com/elastic/go-elasticsearch/v7@v7.17.10/esapi/api.info.go:117
github.com/elastic/go-elasticsearch/v7.(*Client).productCheck
github.com/elastic/go-elasticsearch/v7@v7.17.10/elasticsearch.go:350
github.com/elastic/go-elasticsearch/v7.(*Client).doProductCheck
github.com/elastic/go-elasticsearch/v7@v7.17.10/elasticsearch.go:337
github.com/elastic/go-elasticsearch/v7.(*Client).Perform
github.com/elastic/go-elasticsearch/v7@v7.17.10/elasticsearch.go:298
github.com/elastic/go-elasticsearch/v7/esapi.BulkRequest.Do
github.com/elastic/go-elasticsearch/v7@v7.17.10/esapi/api.bulk.go:188
github.com/elastic/go-elasticsearch/v7/esutil.(*worker).flush
github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:553
github.com/elastic/go-elasticsearch/v7/esutil.(*bulkIndexer).init.func1
github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:321
2023-12-07T12:42:56.582Z error elasticsearchexporter@v0.90.1/elasticsearch_bulk.go:150 Bulk indexer error: flush: dial tcp 10.46.48.237:18726: i/o timeout {"kind": "exporter", "data_type": "logs", "name": "elasticsearch/log"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.newBulkIndexer.func1
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.90.1/elasticsearch_bulk.go:150
github.com/elastic/go-elasticsearch/v7/esutil.(*worker).flush
github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:557
github.com/elastic/go-elasticsearch/v7/esutil.(*bulkIndexer).init.func1
github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:321
2023-12-07T12:42:56.582Z error elasticsearchexporter@v0.90.1/elasticsearch_bulk.go:150 Bulk indexer error: flush: dial tcp 10.46.48.237:18726: i/o timeout {"kind": "exporter", "data_type": "logs", "name": "elasticsearch/log"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.newBulkIndexer.func1
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.90.1/elasticsearch_bulk.go:150
github.com/elastic/go-elasticsearch/v7/esutil.(*bulkIndexer).init.func1
github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:324
2023-12-07T12:43:26.583Z error elasticsearchexporter@v0.90.1/elasticsearch_bulk.go:49 Request failed. {"kind": "exporter", "data_type": "logs", "name": "elasticsearch/log", "reason": "dial tcp 10.46.48.237:18726: i/o timeout"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.(*clientLogger).LogRoundTrip
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.90.1/elasticsearch_bulk.go:49
github.com/elastic/go-elasticsearch/v7/estransport.(*Client).logRoundTrip
github.com/elastic/go-elasticsearch/v7@v7.17.10/estransport/estransport.go:576
github.com/elastic/go-elasticsearch/v7/estransport.(*Client).Perform
github.com/elastic/go-elasticsearch/v7@v7.17.10/estransport/estransport.go:372
github.com/elastic/go-elasticsearch/v7/esapi.InfoRequest.Do
github.com/elastic/go-elasticsearch/v7@v7.17.10/esapi/api.info.go:117
github.com/elastic/go-elasticsearch/v7.(*Client).productCheck
github.com/elastic/go-elasticsearch/v7@v7.17.10/elasticsearch.go:350
github.com/elastic/go-elasticsearch/v7.(*Client).doProductCheck
github.com/elastic/go-elasticsearch/v7@v7.17.10/elasticsearch.go:337
github.com/elastic/go-elasticsearch/v7.(*Client).Perform
github.com/elastic/go-elasticsearch/v7@v7.17.10/elasticsearch.go:298
github.com/elastic/go-elasticsearch/v7/esapi.BulkRequest.Do
github.com/elastic/go-elasticsearch/v7@v7.17.10/esapi/api.bulk.go:188
github.com/elastic/go-elasticsearch/v7/esutil.(*worker).flush
github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:553
github.com/elastic/go-elasticsearch/v7/esutil.(*bulkIndexer).init.func1
github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:321

Steps to Reproduce

Expected Result

The elasticsearch exporter can send logs to Elastic Cloud.

Actual Result

Collector version

0.89

Environment information

Environment

EKS: 1.26

OpenTelemetry Collector configuration

mode: daemonset

presets:
  # enables the k8sattributesprocessor and adds it to the traces, metrics, and logs pipelines
  kubernetesAttributes:
    enabled: true
  # enables the kubeletstatsreceiver and adds it to the metrics pipelines
  kubeletMetrics:
    enabled: true
  # Enables the filelogreceiver and adds it to the logs pipelines
  logsCollection:
    enabled: true
    includeCollectorLogs: false
    # Enabling this writes checkpoints in /var/lib/otelcol/ host directory.
    # Note this changes collector's user to root, so that it can write to host directory.
    storeCheckpoints: false
    # The maximum bytes size of the recombined field.
    # Once the size exceeds the limit, all received entries of the source will be combined and flushed.
    maxRecombineLogSize: 102400
  hostMetrics:
    enabled: true


serviceAccount:
  # Specifies whether a service account should be created
  create: true

clusterRole:
  # Specifies whether a clusterRole should be created
  # Some presets also trigger the creation of a cluster role and cluster role binding.
  # If using one of those presets, this field is no-op.
  create: true
  # Annotations to add to the clusterRole
  # Can be used in combination with presets that create a cluster role.
  annotations: {}
  # The name of the clusterRole to use.
  # If not set a name is generated using the fullname template
  # Can be used in combination with presets that create a cluster role.
  name: ""
  # A set of rules as documented here : https://kubernetes.io/docs/reference/access-authn-authz/rbac/
  # Can be used in combination with presets that create a cluster role to add additional rules.
  rules: 
  - apiGroups:
    - '*'
    resources:
    - '*'
    verbs:
    - 'get'
    - 'list'
    - 'watch'

  clusterRoleBinding:
    # Annotations to add to the clusterRoleBinding
    # Can be used in combination with presets that create a cluster role binding.
    annotations: {}
    # The name of the clusterRoleBinding to use.
    # If not set a name is generated using the fullname template
    # Can be used in combination with presets that create a cluster role binding.
    name: ""

## The chart only includes the loggingexporter by default
## If you want to send your data somewhere you need to
## configure an exporter, such as the otlpexporter
config:
  receivers:
    filelog:
      include:
        - /var/log/pods/*/*/*.log
      exclude:
        # Exclude logs from all containers named otel-collector
        - /var/log/pods/*/otel-collector/*.log
      start_at: beginning
      include_file_path: true
      include_file_name: false   
      operators:
        # Find out which format is used by kubernetes
        - type: router
          id: get-format
          routes:
            - output: parser-docker
              expr: 'body matches "^\\{"'
            - output: parser-crio
              expr: 'body matches "^[^ Z]+ "'
            - output: parser-containerd
              expr: 'body matches "^[^ Z]+Z"'
        # Parse CRI-O format
        - type: regex_parser
          id: parser-crio
          regex: '^(?P<time>[^ Z]+) (?P<stream>stdout|stderr) (?P<logtag>[^ ]*) ?(?P<log>.*)$'
          timestamp:
            parse_from: attributes.time
            layout_type: gotime
            layout: '2006-01-02T15:04:05.999999999Z07:00'
        - type: recombine
          id: crio-recombine
          output: extract_metadata_from_filepath
          combine_field: attributes.log
          source_identifier: attributes["log.file.path"]
          is_last_entry: "attributes.logtag == 'F'"
          combine_with: ""
          max_log_size: 10400
        # Parse CRI-Containerd format
        - type: regex_parser
          id: parser-containerd
          regex: '^(?P<time>[^ ^Z]+Z) (?P<stream>stdout|stderr) (?P<logtag>[^ ]*) ?(?P<log>.*)$'
          timestamp:
            parse_from: attributes.time
            layout: '%Y-%m-%dT%H:%M:%S.%LZ'
        - type: recombine
          id: containerd-recombine
          output: extract_metadata_from_filepath
          combine_field: attributes.log
          source_identifier: attributes["log.file.path"]
          is_last_entry: "attributes.logtag == 'F'"
          combine_with: ""
          max_log_size: 10400
        # Parse Docker format
        - type: json_parser
          id: parser-docker
          output: extract_metadata_from_filepath
          timestamp:
            parse_from: attributes.time
            layout: '%Y-%m-%dT%H:%M:%S.%LZ'
        # Extract metadata from file path
        - type: regex_parser
          id: extract_metadata_from_filepath
          regex: '^.*\/(?P<namespace>[^_]+)_(?P<pod_name>[^_]+)_(?P<uid>[a-f0-9\-]+)\/(?P<container_name>[^\._]+)\/(?P<restart_count>\d+)\.log$'
          parse_from: attributes["log.file.path"]
        # Rename attributes
        - type: move
          from: attributes.stream
          to: attributes["log.iostream"]
        - type: move
          from: attributes.container_name
          to: resource["container_name"]
        - type: move
          from: attributes.namespace
          to: resource["namespace"]
        - type: move
          from: attributes.pod_name
          to: resource["pod_name"]
        - type: move
          from: attributes.restart_count
          to: resource["container_restart_count"]
        - type: move
          from: attributes.uid
          to: resource["k8s_pod_uid"]
        # Clean up log body
        - type: move
          from: attributes.log
          to: body

  processors:
    memory_limiter:
      check_interval: 1s
      limit_mib: 2000
    batch: {}
  
    # k8sattributes processor to get the metadata from K8s
    k8sattributes:
      passthrough: false
      extract:
        metadata:
        - k8s.namespace.name
        - k8s.deployment.name
        - k8s.statefulset.name
        - k8s.daemonset.name
        - k8s.cronjob.name
        - k8s.job.name
        - k8s.node.name
        - k8s.pod.name
        - k8s.pod.uid
        - k8s.pod.start_time
      # Pod association using resource attributes and connection
      pod_association:
      - sources:
        - from: resource_attribute
          name: k8s.pod.ip
      - sources:
        - from: resource_attribute
          name: k8s.pod.uid
      - sources:
        - from: connection          
  exporters:
    logging:
      loglevel: debug
    debug:
       verbosity: detailed
    elasticsearch/trace:
      tls:
        insecure: false
        insecure_skip_verify: true
      endpoints: [https://quickstart-es-http:9200]
      timeout: 2m
      headers:
        myheader: test
      traces_index: trace_index
      api_key: "aEluOF9Zc0JTYTJIQm9QanR4b1"
      discover:
        on_start: true
      flush:
        bytes: 10485760
      retry:
        max_requests: 5
    elasticsearch/log:
      endpoints: ["https://otel-demo.es.gcp.elastic-cloud.com:443"]
      logs_index: app_log_index
      mapping:
        mode: ecs
      timeout: 2m
      headers:
        myheader: test
      api_key: "ZnRrdw=="
      discover:
        on_start: true
      flush:
        bytes: 10485760
      retry:
        max_requests: 5
      sending_queue:
        enabled: true     
  service:
    pipelines:
      logs:
        exporters:
          - elasticsearch/log
        processors:
          - memory_limiter
          - k8sattributes
          - batch
        receivers:
          - filelog
      metrics:
        exporters:
          - otlp/elastic
        processors:
          - k8sattributes
          - memory_limiter
          - batch
        receivers:
          - hostmetrics
          - kubeletstats
resources:
  limits:
    cpu: 200m
    memory: 800M

Log output

github.com/elastic/go-elasticsearch/v7.(*Client).Perform
    github.com/elastic/go-elasticsearch/v7@v7.17.10/elasticsearch.go:298
github.com/elastic/go-elasticsearch/v7/esapi.BulkRequest.Do
    github.com/elastic/go-elasticsearch/v7@v7.17.10/esapi/api.bulk.go:188
github.com/elastic/go-elasticsearch/v7/esutil.(*worker).flush
    github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:553
github.com/elastic/go-elasticsearch/v7/esutil.(*worker).run.func1
    github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:386
2023-12-07T12:51:09.561Z    error    elasticsearchexporter@v0.89.0/elasticsearch_bulk.go:150    Bulk indexer error: context deadline exceeded    {"kind": "exporter", "data_type": "logs", "name": "elasticsearch/log"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.newBulkIndexer.func1
    github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.89.0/elasticsearch_bulk.go:150
github.com/elastic/go-elasticsearch/v7/esutil.(*bulkIndexer).Add
    github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:232
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.pushDocuments
    github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.89.0/elasticsearch_bulk.go:225
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.(*elasticsearchLogsExporter).pushLogRecord
    github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.89.0/logs_exporter.go:116
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.(*elasticsearchLogsExporter).pushLogsData
    github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.89.0/logs_exporter.go:89
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsRequest).Export
    go.opentelemetry.io/collector/exporter@v0.89.0/exporterhelper/logs.go:58
go.opentelemetry.io/collector/exporter/exporterhelper.(*timeoutSender).send
    go.opentelemetry.io/collector/exporter@v0.89.0/exporterhelper/timeout_sender.go:38
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseRequestSender).send
    go.opentelemetry.io/collector/exporter@v0.89.0/exporterhelper/common.go:33
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsExporterWithObservability).send
    go.opentelemetry.io/collector/exporter@v0.89.0/exporterhelper/logs.go:173
go.opentelemetry.io/collector/exporter/exporterhelper.(*queueSender).Start.func1
    go.opentelemetry.io/collector/exporter@v0.89.0/exporterhelper/queue_sender.go:141
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).Start.func1
    go.opentelemetry.io/collector/exporter@v0.89.0/exporterhelper/internal/bounded_memory_queue.go:46
2023-12-07T12:51:09.761Z    error    elasticsearchexporter@v0.89.0/elasticsearch_bulk.go:150    Bulk indexer error: context deadline exceeded    {"kind": "exporter", "data_type": "logs", "name": "elasticsearch/log"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.newBulkIndexer.func1
    github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.89.0/elasticsearch_bulk.go:150
github.com/elastic/go-elasticsearch/v7/esutil.(*bulkIndexer).Add
    github.com/elastic/go-elasticsearch/v7@v7.17.10/esutil/bulk_indexer.go:232
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.pushDocuments
    github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.89.0/elasticsearch_bulk.go:225
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.(*elasticsearchLogsExporter).pushLogRecord
    github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.89.0/logs_exporter.go:116
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter.(*elasticsearchLogsExporter).pushLogsData
    github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter@v0.89.0/logs_exporter.go:89
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsRequest).Export
    go.opentelemetry.io/collector/exporter@v0.89.0/exporterhelper/logs.go:58
go.opentelemetry.io/collector/exporter/exporterhelper.(*timeoutSender).send
    go.opentelemetry.io/collector/exporter@v0.89.0/exporterhelper/timeout_sender.go:38
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseRequestSender).send
    go.opentelemetry.io/collector/exporter@v0.89.0/exporterhelper/common.go:33
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsExporterWithObservability).send
    go.opentelemetry.io/collector/exporter@v0.89.0/exporterhelper/logs.go:173
go.opentelemetry.io/collector/exporter/exporterhelper.(*queueSender).Start.func1
    go.opentelemetry.io/collector/exporter@v0.89.0/exporterhelper/queue_sender.go:141
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).Start.func1
    go.opentelemetry.io/collector/exporter@v0.89.0/exporterhelper/internal/bounded_memory_queue.go:46

Additional context

No response

smallc2009 added the bug (Something isn't working) and needs triage (New item requiring triage) labels on Dec 7, 2023
Contributor

github-actions bot commented Dec 7, 2023

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@JaredTan95
Member

Thanks, I will check it ASAP.

@JaredTan95
Member

8.10

The elasticsearch exporter does not support 8.x versions for now.

@devamanv

devamanv commented Jan 12, 2024

The elasticsearch exporter does not support 8.x versions for now.

@JaredTan95 I am looking for issues as a first-time contributor to OTel. Could I take this up and try to add support for ES 8.x? If yes, could you please help me get started? Any pointers would be helpful. Thanks.

crobert-1 removed the needs triage (New item requiring triage) label on Jan 17, 2024
@crobert-1
Member

@JaredTan95 Could you clarify which versions are supported? It would be good to document this if possible.

crobert-1 added the documentation (Improvements or additions to documentation) label and removed the bug (Something isn't working) label on Jan 17, 2024
Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the Stale label on Mar 18, 2024
@marcomusso

/label waiting-for-author

Not being able to send to Elasticsearch 8.x means this exporter is unusable for many users. I'll happily test it if needed but I don't have time to prepare a proper PR, sorry.

github-actions bot removed the Stale label on Mar 20, 2024
@carsonip
Contributor

carsonip commented Apr 8, 2024

I do not experience any issues with sending to Elastic Cloud 8.x using the elasticsearch exporter.

Setup:

ocb-config-main.yaml:

dist:
  module: github.com/open-telemetry/opentelemetry-collector # the module name for the new distribution, following Go mod conventions. Optional, but recommended.
  name: collector # the binary name. Optional.
  description: "Custom OpenTelemetry Collector distribution" # a long name for the application. Optional.
  otelcol_version: "0.96.0" # the OpenTelemetry Collector version to use as base for the distribution. Optional.
  output_path: ./build_main/ # the path to write the output (sources and binary). Optional.
  version: "1.0.0" # the version for your custom OpenTelemetry Collector. Optional.
#  go: "/usr/bin/go" # which Go binary to use to compile the generated sources. Optional.
#  debug_compilation: false # enabling this causes the builder to keep the debug symbols in the resulting binary. Optional.
exporters:
  - gomod: "github.com/open-telemetry/opentelemetry-collector-contrib/exporter/elasticsearchexporter v0.96.0" # the Go module for the component. Required.
receivers:
  - gomod:
      go.opentelemetry.io/collector/receiver/otlpreceiver v0.96.0

command to build otel collector:

ocb --config=ocb-config-main.yaml --name="my-otelcol"

otelcol-main.yaml:

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: localhost:4317
      http:
        endpoint: localhost:4318

exporters:
  elasticsearch:
    endpoints: [ "https://**redacted**.cloud.es.io" ]
    logs_index: foo
    api_key: **redacted**
    retry:
      enabled: true
      max_requests: 10000
service:
  pipelines:
    logs:
      receivers: [otlp]
      processors: []
      exporters: [elasticsearch]

command to run the collector:

./build_main/my-otelcol --config otelcol-main.yaml

In another terminal, run the command to send sample logs:

telemetrygen logs --otlp-endpoint=localhost:4317 --otlp-insecure --logs 100

In Kibana, there are 100 logs indexed:
[Screenshot: Kibana Discover (Elastic), 2024-04-08 13:47, showing the indexed logs]

I'm getting errors showing that it keeps dialing 10.46.48.34:18422 and timing out.
2023-12-07T12:42:56.582Z error elasticsearchexporter@v0.90.1/elasticsearch_bulk.go:150 Bulk indexer error: flush: dial tcp 10.46.48.237:18726: i/o timeout {"kind": "exporter", "data_type": "logs", "name": "elasticsearch/log"}

The error in this issue appears to be a connectivity problem rather than a bug in the elasticsearch exporter.
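
A minimal, generic way to check that (this is not a step taken from the thread; the endpoint below is copied from the reporter's exporter config, and the probe would need to run from the collector's network, for example a debug pod on the same node) is a plain TCP dial with a timeout, which fails with the same kind of "dial tcp ...: i/o timeout" seen above when the address is unreachable:

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Endpoint taken from the reporter's elasticsearch/log exporter config.
	const endpoint = "otel-demo.es.gcp.elastic-cloud.com:443"

	conn, err := net.DialTimeout("tcp", endpoint, 10*time.Second)
	if err != nil {
		// e.g. "dial tcp <ip>:443: i/o timeout" if the path is blocked
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()
	fmt.Println("TCP connection established to", conn.RemoteAddr())
}

A successful dial (even if authentication would later fail) means the network path itself is fine, while a timeout points to a network-level problem.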

The elasticsearch exporter does not support 8.x versions for now.

@JaredTan95 do you mind clarifying what is not supported?

@marcomusso

marcomusso commented Apr 8, 2024

In our tests (to bridge AWS CloudWatch logs to self-hosted Elastic v8) we had to build the collector with go-elasticsearch v8 (basically waiting for this PR to be merged).
PS: sorry, I don't have the actual error returned, but I remember it was related to not being able to send documents from go-elasticsearch v7 to our v8 API endpoint.

@carsonip
Contributor

carsonip commented Apr 8, 2024

In our tests (to bridge AWS CloudWatch logs to self-hosted Elastic v8) we had to build the collector with go-elasticsearch v8 (basically waiting for this PR to be merged). PS: sorry, I don't have the actual error returned, but I remember it was related to not being able to send documents from go-elasticsearch v7 to our v8 API endpoint.

It would be helpful to get the actual error log in your case since I am not able to reproduce the issue.

On a separate note, since supposedly go-elasticsearch v7 works for both v7 and v8, and upgrading to go-elasticsearch v8 will break support for v7 (see issue), I don't see #30262 getting merged soon. However, if you could show some errors from your use case, that could help make a case for adding support specifically for v8 (via a feature flag, maybe?).

@marcomusso

The error was context deadline exceeded (exporter 0.94); it was solved by compiling with the Go module for v8.

[Screenshot: collector error output with the context deadline exceeded stack trace, 2024-04-08 18:10]

@carsonip
Contributor

carsonip commented Apr 8, 2024

The error was context deadline exceeded (exporter 0.94); it was solved by compiling with the Go module for v8.

Thanks, the screenshot is very helpful.

Here's my hypothesis: as you can see from the stack trace, there is a timeout_sender.go:49 frame. Apparently, there is a default timeout of 5s for request export, while the exporter code has a default 90s timeout that is passed into the go-elasticsearch bulk indexer. This means that, by default, if Elasticsearch takes more than 5s to respond, the context will reach its deadline before the go-elasticsearch HTTP client gives up, hence the "context deadline exceeded" error. It is a combination of bad hardcoded defaults and a slow Elasticsearch.
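
A small, self-contained sketch of that interplay (none of this is collector code; the 5s and 90s values are the defaults described above, and the 8s Elasticsearch response time is an assumption chosen to trip the outer deadline):

package main

import (
	"context"
	"errors"
	"fmt"
	"time"
)

// slowFlush stands in for a bulk-indexer flush: it has its own generous
// timeout (modeled on the 90s default mentioned above) but also honors the
// caller's context, which carries the shorter exporterhelper deadline.
func slowFlush(ctx context.Context, flushTimeout time.Duration) error {
	flushCtx, cancel := context.WithTimeout(ctx, flushTimeout)
	defer cancel()
	select {
	case <-time.After(8 * time.Second): // assume Elasticsearch takes ~8s to respond
		return nil
	case <-flushCtx.Done():
		return flushCtx.Err()
	}
}

func main() {
	// Outer deadline modeled on the exporterhelper default of 5s.
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	if err := slowFlush(ctx, 90*time.Second); errors.Is(err, context.DeadlineExceeded) {
		fmt.Println("Bulk indexer error:", err) // surfaces as "context deadline exceeded"
	}
}

With these numbers the outer 5s deadline always expires first, which matches the repeated "context deadline exceeded" entries in the log output above.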

As to why upgrading to go-elasticsearch v8 solves your issue, it could be either changes within go-elasticsearch v8 or changes in how go-elasticsearch is used. Do you mind sharing the exact code changes you made to upgrade from go-elasticsearch v7 to v8, as well as the collector configuration (please redact sensitive information)? It must be more than just a go mod replace, since there is code that references v7 explicitly.

It could also be that certain errors are retried in v7 but not in v8, so that the retries in v7 take more than 5s and cause the context deadline to be exceeded before the bulk indexer can finish flushing.

@marcomusso

I doubt our Elasticsearch API takes more than 5s to respond, but I lack the data from when the tests were conducted. I'll ask my colleague to chip in and provide more context as soon as possible.

Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the Stale label on Jun 11, 2024
crobert-1 removed the Stale label on Jun 11, 2024
Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the Stale label on Aug 12, 2024
Contributor

This issue has been closed as inactive because it has been stale for 120 days with no activity.

github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) on Oct 11, 2024