[TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block] #31343

Closed
julianocosta89 opened this issue Feb 20, 2024 · 3 comments
Labels
exporter/opensearch, question (Further information is requested)

Comments

@julianocosta89
Member

julianocosta89 commented Feb 20, 2024

Component(s)

exporter/opensearch

What happened?

Description

Since we added the opensearch exporter to the OpenTelemetry Demo, the Collector has started throwing errors.

Steps to Reproduce

Clone the OpenTelemetry Demo repo (https://github.com/open-telemetry/opentelemetry-demo) and run:

docker compose up

Expected Result

I expect the Collector to run without throwing errors.

Actual Result

The following message is logged repeatedly:

2024-02-20T13:37:08.758Z	info	exporterhelper/retry_sender.go:118	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "error": "{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}", "interval": "26.169887369s"}
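
This error means OpenSearch has placed the otel index into a read-only-allow-delete state because its node crossed the flood-stage disk watermark. As a minimal diagnostic sketch (assuming the OpenSearch container's port 9200 is published to the host, which may not be the case in the demo's default compose file):

# Show per-node disk usage, which reveals whether the watermark is exceeded
curl -s "http://localhost:9200/_cat/allocation?v"

# Show the block OpenSearch placed on the otel index
curl -s "http://localhost:9200/otel/_settings?pretty"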

Collector version

0.94.0

Environment information

Environment

OS: macOS Sonoma Version 14.3.1
Apple M1 Max

Running on Docker Desktop:
Engine: 24.0.7
Compose: v2.23.3-desktop.2

OpenTelemetry Collector configuration

receivers:
  otlp:
    protocols:
      grpc:
      http:
        cors:
          allowed_origins:
            - "http://*"
            - "https://*"
  httpcheck/frontendproxy:
    targets:
      - endpoint: http://frontendproxy:${env:ENVOY_PORT}

exporters:
  debug:
  otlp:
    endpoint: "jaeger:4317"
    tls:
      insecure: true
  otlphttp/prometheus:
    endpoint: "http://prometheus:9090/api/v1/otlp"
    tls:
      insecure: true
  opensearch:
    logs_index: otel
    http:
      endpoint: "http://opensearch:9200"
      tls:
        insecure: true

processors:
  batch:

connectors:
  spanmetrics:

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp, debug, spanmetrics]
    metrics:
      receivers: [httpcheck/frontendproxy, otlp, spanmetrics]
      processors: [batch]
      exporters: [otlphttp/prometheus, debug]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [opensearch, debug]

Log output

2024-02-20T13:35:16.745Z	info	service@v0.94.1/telemetry.go:59	Setting up own telemetry...
2024-02-20T13:35:16.745Z	info	service@v0.94.1/telemetry.go:104	Serving metrics	{"address": ":8888", "level": "Basic"}
2024-02-20T13:35:16.745Z	info	exporter@v0.94.1/exporter.go:275	Development component. May change in the future.	{"kind": "exporter", "data_type": "traces", "name": "debug"}
2024-02-20T13:35:16.745Z	info	exporter@v0.94.1/exporter.go:275	Development component. May change in the future.	{"kind": "exporter", "data_type": "metrics", "name": "debug"}
2024-02-20T13:35:16.747Z	info	spanmetricsconnector@v0.94.0/connector.go:107	Building spanmetrics connector	{"kind": "connector", "name": "spanmetrics", "exporter_in_pipeline": "traces", "receiver_in_pipeline": "metrics"}
2024-02-20T13:35:16.747Z	info	receiver@v0.94.1/receiver.go:296	Development component. May change in the future.	{"kind": "receiver", "name": "httpcheck/frontendproxy", "data_type": "metrics"}
2024-02-20T13:35:16.747Z	info	exporter@v0.94.1/exporter.go:275	Development component. May change in the future.	{"kind": "exporter", "data_type": "logs", "name": "opensearch"}
2024-02-20T13:35:16.747Z	info	exporter@v0.94.1/exporter.go:275	Development component. May change in the future.	{"kind": "exporter", "data_type": "logs", "name": "debug"}
2024-02-20T13:35:16.747Z	warn	filesystemscraper/factory.go:60	No `root_path` config set when running in docker environment, will report container filesystem stats. See https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/receiver/hostmetricsreceiver#collecting-host-metrics-from-inside-a-container-linux-only	{"kind": "receiver", "name": "hostmetrics", "data_type": "metrics"}
2024-02-20T13:35:16.748Z	info	service@v0.94.1/service.go:140	Starting otelcol-contrib...	{"Version": "0.94.0", "NumCPU": 10}
2024-02-20T13:35:16.748Z	info	extensions/extensions.go:34	Starting extensions...
2024-02-20T13:35:16.748Z	info	internal/resourcedetection.go:125	began detecting resource information	{"kind": "processor", "name": "resourcedetection", "pipeline": "metrics"}
2024-02-20T13:35:16.749Z	info	system/system.go:201	This attribute changed from int to string. Temporarily switch back to int using the feature gate.	{"kind": "processor", "name": "resourcedetection", "pipeline": "metrics", "attribute": "host.cpu.family", "feature gate": "processor.resourcedetection.hostCPUModelAndFamilyAsString"}
2024-02-20T13:35:16.749Z	info	system/system.go:220	This attribute changed from int to string. Temporarily switch back to int using the feature gate.	{"kind": "processor", "name": "resourcedetection", "pipeline": "metrics", "attribute": "host.cpu.model.id", "feature gate": "processor.resourcedetection.hostCPUModelAndFamilyAsString"}
2024-02-20T13:35:16.749Z	info	internal/resourcedetection.go:139	detected resource information	{"kind": "processor", "name": "resourcedetection", "pipeline": "metrics", "resource": {"host.arch":"arm64","host.cpu.cache.l2.size":0,"host.cpu.family":"","host.cpu.model.id":"0x000","host.cpu.model.name":"","host.cpu.stepping":0,"host.cpu.vendor.id":"Apple","host.name":"e8771a4c7f9b","os.description":"Linux e8771a4c7f9b 6.5.11-linuxkit #1 SMP PREEMPT Wed Dec  6 17:08:31 UTC 2023 aarch64","os.type":"linux"}}
2024-02-20T13:35:16.749Z	info	spanmetricsconnector@v0.94.0/connector.go:189	Starting spanmetrics connector	{"kind": "connector", "name": "spanmetrics", "exporter_in_pipeline": "traces", "receiver_in_pipeline": "metrics"}
2024-02-20T13:35:16.749Z	warn	internal@v0.94.1/warning.go:42	Using the 0.0.0.0 address exposes this server to every network interface, which may facilitate Denial of Service attacks. Enable the feature gate to change the default and remove this warning.	{"kind": "receiver", "name": "otlp", "data_type": "metrics", "documentation": "https://github.com/open-telemetry/opentelemetry-collector/blob/main/docs/security-best-practices.md#safeguards-against-denial-of-service-attacks", "feature gate ID": "component.UseLocalHostAsDefaultHost"}
2024-02-20T13:35:16.749Z	info	otlpreceiver@v0.94.1/otlp.go:102	Starting GRPC server	{"kind": "receiver", "name": "otlp", "data_type": "metrics", "endpoint": "0.0.0.0:4317"}
2024-02-20T13:35:16.749Z	warn	internal@v0.94.1/warning.go:42	Using the 0.0.0.0 address exposes this server to every network interface, which may facilitate Denial of Service attacks. Enable the feature gate to change the default and remove this warning.	{"kind": "receiver", "name": "otlp", "data_type": "metrics", "documentation": "https://github.com/open-telemetry/opentelemetry-collector/blob/main/docs/security-best-practices.md#safeguards-against-denial-of-service-attacks", "feature gate ID": "component.UseLocalHostAsDefaultHost"}
2024-02-20T13:35:16.750Z	info	otlpreceiver@v0.94.1/otlp.go:152	Starting HTTP server	{"kind": "receiver", "name": "otlp", "data_type": "metrics", "endpoint": "0.0.0.0:4318"}
2024-02-20T13:35:16.750Z	info	service@v0.94.1/service.go:166	Everything is ready. Begin running and processing data.
2024-02-20T13:35:16.750Z	warn	localhostgate/featuregate.go:63	The default endpoints for all servers in components will change to use localhost instead of 0.0.0.0 in a future version. Use the feature gate to preview the new default.	{"feature gate ID": "component.UseLocalHostAsDefaultHost"}
2024-02-20T13:35:17.557Z	error	opensearchexporter@v0.94.0/logger.go:36	Request failed.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "path": "/_bulk", "method": "POST", "duration": 0.001637917, "reason": "dial tcp 172.18.0.5:9200: connect: connection refused"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter.(*clientLogger).LogRoundTrip
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter@v0.94.0/logger.go:36
github.com/opensearch-project/opensearch-go/v2/opensearchtransport.(*Client).logRoundTrip
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchtransport/opensearchtransport.go:498
github.com/opensearch-project/opensearch-go/v2/opensearchtransport.(*Client).Perform
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchtransport/opensearchtransport.go:337
github.com/opensearch-project/opensearch-go/v2.(*Client).Perform
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearch.go:267
github.com/opensearch-project/opensearch-go/v2/opensearchapi.BulkRequest.Do
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchapi/api.bulk.go:192
github.com/opensearch-project/opensearch-go/v2/opensearchutil.(*worker).flush
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchutil/bulk_indexer.go:552
github.com/opensearch-project/opensearch-go/v2/opensearchutil.(*bulkIndexer).Close
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchutil/bulk_indexer.go:299
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter.(*logBulkIndexer).close
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter@v0.94.0/log_bulk_indexer.go:41
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter.(*logExporter).pushLogData
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter@v0.94.0/sso_log_exporter.go:73
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsRequest).Export
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:59
go.opentelemetry.io/collector/exporter/exporterhelper.(*timeoutSender).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/timeout_sender.go:43
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/retry_sender.go:89
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsExporterWithObservability).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:171
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseRequestSender).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/common.go:35
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseExporter).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/common.go:199
go.opentelemetry.io/collector/exporter/exporterhelper.NewLogsExporter.func1
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:99
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/internal/fanoutconsumer.(*logsConsumer).ConsumeLogs
	go.opentelemetry.io/collector@v0.94.1/internal/fanoutconsumer/logs.go:64
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1
	go.opentelemetry.io/collector/processor@v0.94.1/processorhelper/logs.go:60
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1
	go.opentelemetry.io/collector/processor@v0.94.1/processorhelper/logs.go:60
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/processor/batchprocessor.(*batchLogs).export
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:489
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).sendItems
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:256
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).start
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:218
2024-02-20T13:35:17.557Z	error	exporterhelper/common.go:201	Exporting failed. Rejecting data.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "error": "not retryable error: Permanent error: Permanent error: flush: dial tcp 172.18.0.5:9200: connect: connection refused\nPermanent error: Permanent error: flush: dial tcp 172.18.0.5:9200: connect: connection refused", "rejected_items": 1}
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseExporter).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/common.go:201
go.opentelemetry.io/collector/exporter/exporterhelper.NewLogsExporter.func1
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:99
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/internal/fanoutconsumer.(*logsConsumer).ConsumeLogs
	go.opentelemetry.io/collector@v0.94.1/internal/fanoutconsumer/logs.go:64
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1
	go.opentelemetry.io/collector/processor@v0.94.1/processorhelper/logs.go:60
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1
	go.opentelemetry.io/collector/processor@v0.94.1/processorhelper/logs.go:60
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/processor/batchprocessor.(*batchLogs).export
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:489
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).sendItems
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:256
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).start
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:218
2024-02-20T13:35:17.557Z	info	LogsExporter	{"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 1}
2024-02-20T13:35:17.557Z	warn	batchprocessor@v0.94.1/batch_processor.go:258	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "not retryable error: Permanent error: Permanent error: flush: dial tcp 172.18.0.5:9200: connect: connection refused\nPermanent error: Permanent error: flush: dial tcp 172.18.0.5:9200: connect: connection refused"}
1708436117562097304 [Debug] Error fetching info for pid 1: %!w(*fs.PathError=&{open /etc/passwd 2})
2024-02-20T13:35:17.953Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 9, "metrics": 29, "data points": 216}
2024-02-20T13:35:22.189Z	error	opensearchexporter@v0.94.0/logger.go:36	Request failed.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "path": "/_bulk", "method": "POST", "duration": 0.000524833, "reason": "dial tcp 172.18.0.5:9200: connect: connection refused"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter.(*clientLogger).LogRoundTrip
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter@v0.94.0/logger.go:36
github.com/opensearch-project/opensearch-go/v2/opensearchtransport.(*Client).logRoundTrip
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchtransport/opensearchtransport.go:498
github.com/opensearch-project/opensearch-go/v2/opensearchtransport.(*Client).Perform
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchtransport/opensearchtransport.go:337
github.com/opensearch-project/opensearch-go/v2.(*Client).Perform
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearch.go:267
github.com/opensearch-project/opensearch-go/v2/opensearchapi.BulkRequest.Do
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchapi/api.bulk.go:192
github.com/opensearch-project/opensearch-go/v2/opensearchutil.(*worker).flush
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchutil/bulk_indexer.go:552
github.com/opensearch-project/opensearch-go/v2/opensearchutil.(*bulkIndexer).Close
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchutil/bulk_indexer.go:299
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter.(*logBulkIndexer).close
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter@v0.94.0/log_bulk_indexer.go:41
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter.(*logExporter).pushLogData
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter@v0.94.0/sso_log_exporter.go:73
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsRequest).Export
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:59
go.opentelemetry.io/collector/exporter/exporterhelper.(*timeoutSender).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/timeout_sender.go:43
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/retry_sender.go:89
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsExporterWithObservability).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:171
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseRequestSender).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/common.go:35
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseExporter).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/common.go:199
go.opentelemetry.io/collector/exporter/exporterhelper.NewLogsExporter.func1
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:99
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/internal/fanoutconsumer.(*logsConsumer).ConsumeLogs
	go.opentelemetry.io/collector@v0.94.1/internal/fanoutconsumer/logs.go:64
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1
	go.opentelemetry.io/collector/processor@v0.94.1/processorhelper/logs.go:60
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1
	go.opentelemetry.io/collector/processor@v0.94.1/processorhelper/logs.go:60
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/processor/batchprocessor.(*batchLogs).export
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:489
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).sendItems
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:256
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).start
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:218
2024-02-20T13:35:22.189Z	error	exporterhelper/common.go:201	Exporting failed. Rejecting data.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "error": "not retryable error: Permanent error: Permanent error: flush: dial tcp 172.18.0.5:9200: connect: connection refused\nPermanent error: Permanent error: flush: dial tcp 172.18.0.5:9200: connect: connection refused", "rejected_items": 2}
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseExporter).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/common.go:201
go.opentelemetry.io/collector/exporter/exporterhelper.NewLogsExporter.func1
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:99
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/internal/fanoutconsumer.(*logsConsumer).ConsumeLogs
	go.opentelemetry.io/collector@v0.94.1/internal/fanoutconsumer/logs.go:64
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1
	go.opentelemetry.io/collector/processor@v0.94.1/processorhelper/logs.go:60
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1
	go.opentelemetry.io/collector/processor@v0.94.1/processorhelper/logs.go:60
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/processor/batchprocessor.(*batchLogs).export
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:489
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).sendItems
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:256
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).start
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:218
2024-02-20T13:35:22.189Z	info	LogsExporter	{"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 2}
2024-02-20T13:35:22.189Z	warn	batchprocessor@v0.94.1/batch_processor.go:258	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "not retryable error: Permanent error: Permanent error: flush: dial tcp 172.18.0.5:9200: connect: connection refused\nPermanent error: Permanent error: flush: dial tcp 172.18.0.5:9200: connect: connection refused"}
2024-02-20T13:35:22.990Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 1, "metrics": 3, "data points": 3}
2024-02-20T13:35:22.992Z	error	opensearchexporter@v0.94.0/logger.go:36	Request failed.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "path": "/_bulk", "method": "POST", "duration": 0.000430167, "reason": "dial tcp 172.18.0.5:9200: connect: connection refused"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter.(*clientLogger).LogRoundTrip
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter@v0.94.0/logger.go:36
github.com/opensearch-project/opensearch-go/v2/opensearchtransport.(*Client).logRoundTrip
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchtransport/opensearchtransport.go:498
github.com/opensearch-project/opensearch-go/v2/opensearchtransport.(*Client).Perform
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchtransport/opensearchtransport.go:337
github.com/opensearch-project/opensearch-go/v2.(*Client).Perform
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearch.go:267
github.com/opensearch-project/opensearch-go/v2/opensearchapi.BulkRequest.Do
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchapi/api.bulk.go:192
github.com/opensearch-project/opensearch-go/v2/opensearchutil.(*worker).flush
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchutil/bulk_indexer.go:552
github.com/opensearch-project/opensearch-go/v2/opensearchutil.(*bulkIndexer).Close
	github.com/opensearch-project/opensearch-go/v2@v2.3.0/opensearchutil/bulk_indexer.go:299
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter.(*logBulkIndexer).close
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter@v0.94.0/log_bulk_indexer.go:41
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter.(*logExporter).pushLogData
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/opensearchexporter@v0.94.0/sso_log_exporter.go:73
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsRequest).Export
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:59
go.opentelemetry.io/collector/exporter/exporterhelper.(*timeoutSender).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/timeout_sender.go:43
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/retry_sender.go:89
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsExporterWithObservability).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:171
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseRequestSender).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/common.go:35
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseExporter).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/common.go:199
go.opentelemetry.io/collector/exporter/exporterhelper.NewLogsExporter.func1
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:99
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/internal/fanoutconsumer.(*logsConsumer).ConsumeLogs
	go.opentelemetry.io/collector@v0.94.1/internal/fanoutconsumer/logs.go:64
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1
	go.opentelemetry.io/collector/processor@v0.94.1/processorhelper/logs.go:60
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1
	go.opentelemetry.io/collector/processor@v0.94.1/processorhelper/logs.go:60
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/processor/batchprocessor.(*batchLogs).export
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:489
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).sendItems
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:256
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).start
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:218
2024-02-20T13:35:22.992Z	error	exporterhelper/common.go:201	Exporting failed. Rejecting data.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "error": "not retryable error: Permanent error: Permanent error: flush: dial tcp 172.18.0.5:9200: connect: connection refused\nPermanent error: Permanent error: flush: dial tcp 172.18.0.5:9200: connect: connection refused", "rejected_items": 6}
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseExporter).send
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/common.go:201
go.opentelemetry.io/collector/exporter/exporterhelper.NewLogsExporter.func1
	go.opentelemetry.io/collector/exporter@v0.94.1/exporterhelper/logs.go:99
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/internal/fanoutconsumer.(*logsConsumer).ConsumeLogs
	go.opentelemetry.io/collector@v0.94.1/internal/fanoutconsumer/logs.go:64
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1
	go.opentelemetry.io/collector/processor@v0.94.1/processorhelper/logs.go:60
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1
	go.opentelemetry.io/collector/processor@v0.94.1/processorhelper/logs.go:60
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs
	go.opentelemetry.io/collector/consumer@v0.94.1/logs.go:25
go.opentelemetry.io/collector/processor/batchprocessor.(*batchLogs).export
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:489
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).sendItems
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:256
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).start
	go.opentelemetry.io/collector/processor/batchprocessor@v0.94.1/batch_processor.go:218
2024-02-20T13:35:22.992Z	info	LogsExporter	{"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 6}
2024-02-20T13:35:22.992Z	warn	batchprocessor@v0.94.1/batch_processor.go:258	Sender failed	{"kind": "processor", "name": "batch", "pipeline": "logs", "error": "not retryable error: Permanent error: Permanent error: flush: dial tcp 172.18.0.5:9200: connect: connection refused\nPermanent error: Permanent error: flush: dial tcp 172.18.0.5:9200: connect: connection refused"}
2024-02-20T13:35:26.092Z	info	exporterhelper/retry_sender.go:118	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "error": "{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: 
[TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage 
exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index 
has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}\n{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}", "interval": "6.835785263s"}
2024-02-20T13:35:27.811Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 8, "metrics": 27, "data points": 290}
2024-02-20T13:35:32.932Z	info	exporterhelper/retry_sender.go:118	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "error": "{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}", "interval": "4.460586744s"}
2024-02-20T13:35:34.225Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 3, "spans": 15}
2024-02-20T13:35:34.426Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 14, "spans": 557}
2024-02-20T13:35:35.639Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 5}
2024-02-20T13:35:35.840Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 6, "spans": 30}
2024-02-20T13:35:37.397Z	info	exporterhelper/retry_sender.go:118	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "error": "{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}", "interval": "16.402330801s"}
2024-02-20T13:35:37.450Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 6, "spans": 50}
2024-02-20T13:35:37.650Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 13, "spans": 77}
2024-02-20T13:35:37.861Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 8, "metrics": 27, "data points": 290}
2024-02-20T13:35:38.453Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 4, "spans": 66}
2024-02-20T13:35:38.655Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 5}
2024-02-20T13:35:39.458Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 5, "spans": 527}
2024-02-20T13:35:39.659Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 7, "spans": 35}
2024-02-20T13:35:44.676Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 5}
2024-02-20T13:35:45.078Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 5}
2024-02-20T13:35:45.680Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 5, "spans": 25}
2024-02-20T13:35:45.880Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 14, "spans": 66}
2024-02-20T13:35:46.893Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 11, "metrics": 22, "data points": 80}
2024-02-20T13:35:47.085Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 7}
2024-02-20T13:35:47.286Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 435}
2024-02-20T13:35:47.896Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 8, "metrics": 27, "data points": 289}
2024-02-20T13:35:48.089Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 14}
2024-02-20T13:35:48.290Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 41}
2024-02-20T13:35:49.696Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 5}
2024-02-20T13:35:49.897Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 5}
2024-02-20T13:35:53.806Z	info	exporterhelper/retry_sender.go:118	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "error": "{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}", "interval": "12.689910284s"}
2024-02-20T13:35:55.921Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-02-20T13:35:56.122Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 10}
2024-02-20T13:35:57.125Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 7}
2024-02-20T13:35:57.325Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 14}
2024-02-20T13:35:57.927Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 8, "metrics": 27, "data points": 290}
2024-02-20T13:35:58.329Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 20}
2024-02-20T13:35:58.529Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 6}
2024-02-20T13:36:01.940Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 12, "metrics": 24, "data points": 100}
2024-02-20T13:36:02.341Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 2}
2024-02-20T13:36:02.743Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 4, "spans": 10}
2024-02-20T13:36:03.346Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 9}
2024-02-20T13:36:04.751Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 19}
2024-02-20T13:36:05.954Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 6}
2024-02-20T13:36:06.502Z	info	exporterhelper/retry_sender.go:118	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "error": "{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}", "interval": "36.487756973s"}
2024-02-20T13:36:06.756Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 2}
2024-02-20T13:36:07.358Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 7}
2024-02-20T13:36:07.759Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 6, "spans": 15}
2024-02-20T13:36:07.960Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 8, "metrics": 27, "data points": 290}
2024-02-20T13:36:08.361Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 11}
2024-02-20T13:36:08.964Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 5}
2024-02-20T13:36:09.767Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 25}
2024-02-20T13:36:10.973Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 124}
2024-02-20T13:36:11.776Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-02-20T13:36:12.377Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 15}
2024-02-20T13:36:15.990Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 20}
2024-02-20T13:36:16.793Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 13, "metrics": 26, "data points": 108}
2024-02-20T13:36:16.994Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 3, "spans": 15}
2024-02-20T13:36:17.194Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 2}
2024-02-20T13:36:17.395Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 1, "metrics": 24, "data points": 29}
2024-02-20T13:36:17.797Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 9, "metrics": 29, "data points": 296}
2024-02-20T13:36:17.998Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 2}
2024-02-20T13:36:18.398Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 1, "metrics": 25, "data points": 30}
2024-02-20T13:36:18.400Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 35}
2024-02-20T13:36:19.201Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 2}
2024-02-20T13:36:19.201Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 1, "metrics": 21, "data points": 52}
2024-02-20T13:36:19.402Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 5}
2024-02-20T13:36:21.612Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 42}
2024-02-20T13:36:22.214Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 2}
2024-02-20T13:36:22.616Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 6, "spans": 17}
2024-02-20T13:36:23.619Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 1, "metrics": 29, "data points": 61}
2024-02-20T13:36:26.025Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-02-20T13:36:27.030Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 23}
2024-02-20T13:36:27.431Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 2}
2024-02-20T13:36:27.630Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 1, "metrics": 22, "data points": 306}
2024-02-20T13:36:27.831Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 8, "metrics": 27, "data points": 290}
2024-02-20T13:36:28.034Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 2}
2024-02-20T13:36:28.435Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 12}
2024-02-20T13:36:29.237Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 2}
2024-02-20T13:36:30.040Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 5}
2024-02-20T13:36:30.844Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-02-20T13:36:31.846Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 13, "metrics": 26, "data points": 108}
2024-02-20T13:36:32.047Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 1, "metrics": 20, "data points": 26}
2024-02-20T13:36:32.247Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 5}
2024-02-20T13:36:32.448Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-02-20T13:36:33.050Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 1, "metrics": 105, "data points": 153}
2024-02-20T13:36:33.252Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 1, "metrics": 2, "data points": 22}
2024-02-20T13:36:36.662Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-02-20T13:36:37.263Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 11}
2024-02-20T13:36:37.665Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 6, "spans": 22}
2024-02-20T13:36:37.866Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 8, "metrics": 27, "data points": 290}
2024-02-20T13:36:38.468Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 9}
2024-02-20T13:36:38.670Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 23}
2024-02-20T13:36:41.080Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 10, "spans": 46}
2024-02-20T13:36:41.281Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 5}
2024-02-20T13:36:42.285Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 15}
2024-02-20T13:36:42.687Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 7, "spans": 19}
2024-02-20T13:36:43.038Z	info	exporterhelper/retry_sender.go:118	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "error": "{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}", "interval": "25.678350787s"}
2024-02-20T13:36:43.289Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 9, "spans": 46}
2024-02-20T13:36:46.702Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 6, "spans": 30}
2024-02-20T13:36:46.893Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 13, "metrics": 26, "data points": 108}
2024-02-20T13:36:46.903Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 4, "spans": 20}
2024-02-20T13:36:47.304Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 51}
2024-02-20T13:36:47.896Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 8, "metrics": 27, "data points": 290}
2024-02-20T13:36:48.107Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 14}
2024-02-20T13:36:48.308Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 14}
2024-02-20T13:36:49.112Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 6, "spans": 30}
2024-02-20T13:36:49.714Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 225}
2024-02-20T13:36:51.922Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 5}
2024-02-20T13:36:52.324Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 19}
2024-02-20T13:36:53.128Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 8}
2024-02-20T13:36:56.952Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 29}
2024-02-20T13:36:57.368Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 8}
2024-02-20T13:36:57.770Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 8, "metrics": 27, "data points": 290}
2024-02-20T13:36:57.971Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 7}
2024-02-20T13:36:58.374Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 6}
2024-02-20T13:37:00.180Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 5}
2024-02-20T13:37:01.787Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 13, "metrics": 26, "data points": 108}
2024-02-20T13:37:02.186Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 32}
2024-02-20T13:37:02.387Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 11}
2024-02-20T13:37:03.392Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 3}
2024-02-20T13:37:03.594Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 7}
2024-02-20T13:37:06.004Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-02-20T13:37:07.208Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 22}
2024-02-20T13:37:07.409Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 11}
2024-02-20T13:37:07.811Z	info	MetricsExporter	{"kind": "exporter", "data_type": "metrics", "name": "debug", "resource metrics": 8, "metrics": 27, "data points": 290}
2024-02-20T13:37:08.413Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 2, "spans": 9}
2024-02-20T13:37:08.758Z	info	exporterhelper/retry_sender.go:118	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "data_type": "logs", "name": "opensearch", "error": "{\"type\":\"cluster_block_exception\",\"reason\":\"index [otel] blocked by: [TOO_MANY_REQUESTS/12/disk usage exceeded flood-stage watermark, index has read-only-allow-delete block];\",\"caused_by\":{\"type\":\"\",\"reason\":\"\",\"caused_by\":null}}", "interval": "26.169887369s"}

Additional context

No response

@julianocosta89 julianocosta89 added bug Something isn't working needs triage New item requiring triage labels Feb 20, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@crobert-1
Member

Hello @julianocosta89, from my understanding this is an error with your OpenSearch backend, not the Collector. I don't have much context here, but the error message indicates that your backend is out of disk space, so no more data can be written to it. Please free up some space in your cluster and try again.

OpenSearch cluster settings doc: https://opensearch.org/docs/1.3/api-reference/cluster-api/cluster-settings/
It covers the watermark and read-only-allow-delete configuration options.
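
Concretely, once space has been freed, the block on the index can be cleared (a sketch, assuming the cluster API is reachable on localhost:9200; recent OpenSearch versions also lift the block automatically once usage falls back below the high watermark):

# Remove the read-only-allow-delete block from all indices
curl -X PUT "http://localhost:9200/_all/_settings" \
  -H 'Content-Type: application/json' \
  -d '{"index.blocks.read_only_allow_delete": null}'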

@crobert-1 crobert-1 added question Further information is requested and removed bug Something isn't working needs triage New item requiring triage labels Mar 5, 2024
@julianocosta89
Member Author

Thank you @crobert-1.
After increasing the memory, the service stopped failing.

We are good to close this one.
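
For anyone reproducing this on Docker Desktop: the watermark is evaluated against the Docker VM's virtual disk, so growing that disk in Docker Desktop's settings or reclaiming space also clears the condition (hedged example; note that prune permanently deletes unused images, containers, and volumes):

# Reclaim space inside the Docker Desktop VM
docker system prune --volumes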
