headers_setter not setting header value from_context #29676

Closed

gunjan-chauhan-dev opened this issue Dec 6, 2023 · 5 comments
gunjan-chauhan-dev commented Dec 6, 2023

Component(s)

extension/headerssetter

What happened?

Description

I am looking to set up the OpenTelemetry Collector in gateway mode, where each tenant is responsible for sending its tenant ID in the "X-Scope-Orgid" header. The setup uses the headers_setter extension to populate the header value for the otlp exporter.

Steps to Reproduce

Save the following Zipkin-format trace sample as traces.json:

[
  {
    "traceId": "5982fe77008310cc80f1da5e10147519",
    "parentId": "90394f6bcffb5d13",
    "id": "67fae42571535f60",
    "kind": "SERVER",
    "name": "/m/n/2.6.1",
    "timestamp": 1516781775726000,
    "duration": 26000,
    "localEndpoint": {
      "serviceName": "api"
    },
    "remoteEndpoint": {
      "serviceName": "apip"
    },
    "tags": {
      "data.http_response_code": "201"
    }
  }
]

Send the trace to the local OTel Collector via curl:
curl -X POST localhost:9411/api/v2/spans -H 'Content-Type: application/json' -H 'X-Scope-Orgid: <my-tenant-id>' -d @traces.json

Expected Result

Authentication with the traces backend succeeds and traces are propagated.

Actual Result

2023-12-06T19:04:25.134Z error exporterhelper/retry_sender.go:145 Exporting failed. The error is not retryable. Dropping data. {"kind": "exporter", "data_type": "traces", "name": "otlp/cloudoi", "error": "Permanent error: rpc error: code = Unknown desc = authentication token missing", "dropped_items": 1}

go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
	go.opentelemetry.io/collector/exporter@v0.86.0/exporterhelper/retry_sender.go:145
go.opentelemetry.io/collector/exporter/exporterhelper.(*tracesExporterWithObservability).send
	go.opentelemetry.io/collector/exporter@v0.86.0/exporterhelper/traces.go:177
go.opentelemetry.io/collector/exporter/exporterhelper.(*queueSender).start.func1
	go.opentelemetry.io/collector/exporter@v0.86.0/exporterhelper/queue_sender.go:124
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).Start.func1
	go.opentelemetry.io/collector/exporter@v0.86.0/exporterhelper/internal/bounded_memory_queue.go:52

Collector version

v0.86.0

Environment information

Environment

OS: macOS 13.6.1
Running the otel/opentelemetry-collector-contrib:0.86.0 Docker image locally.
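
For reference, a minimal sketch of how the collector might be started locally, assuming the configuration below is saved as config.yaml and the contrib image's default config path of /etc/otelcol-contrib/config.yaml; port 9411 is the Zipkin receiver's default:

# Mount the collector config and expose the Zipkin receiver port
# (paths and ports are assumptions, not taken from the issue).
docker run --rm -p 9411:9411 \
  -v "$(pwd)/config.yaml:/etc/otelcol-contrib/config.yaml" \
  otel/opentelemetry-collector-contrib:0.86.0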

OpenTelemetry Collector configuration

receivers:
  zipkin:
    include_metadata: true
exporters:
  otlp/cloud:
    endpoint: <cloud-otel-col-endpoint>:443
    auth: 
      authenticator: headers_setter/cloud
processors:
  attributes:
    actions:
     - key: "traces_orgid"
       action: insert
       from_context: "X-Scope-Orgid"
  filter/ctd:
    traces:
      span:
        - 'resource.attributes["service.name"] == "frontend-proxy"'
  batch:

extensions:
  headers_setter/cloud:
    headers:
      - action: insert
        key: "X-Scope-Orgid"
        from_context: "X-Scope-Orgid"

service:
  extensions: [ headers_setter/cloud ]
  pipelines:
    traces/ctd:
      receivers: [zipkin ]
      processors: [ attributes, filter/ctd, batch ]
      exporters: [ otlp/cloud  ]

Log output

Same as the Actual Result section above.

Additional context

No response

gunjan-chauhan-dev added the bug and needs triage labels on Dec 6, 2023
github-actions bot commented Dec 6, 2023

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

atoulme commented Dec 6, 2023

You are setting a span attribute from the context with the attributes processor, but the headers_setter extension reads the value back from the context, not from the span:

extensions:
  headers_setter/cloud:
    headers:
      - action: insert
        key: "X-Scope-Orgid"
        from_context: "X-Scope-Orgid"

The context is lost because the batch processor is present in the pipeline.

Try this:

receivers:
  zipkin:
    include_metadata: true
exporters:
  otlp/cloud:
    endpoint: <cloud-otel-col-endpoint>:443
    auth: 
      authenticator: headers_setter/cloud
processors:
  filter/ctd:
    traces:
      span:
        - 'resource.attributes["service.name"] == "frontend-proxy"'
  batch:

extensions:
  headers_setter/cloud:
    headers:
      - action: insert
        key: "X-Scope-Orgid"
        from_context: "X-Scope-Orgid"

service:
  extensions: [ headers_setter/cloud ]
  pipelines:
    traces/ctd:
      receivers: [zipkin ]
      processors: [ filter/ctd ]
      exporters: [ otlp/cloud  ]

atoulme added the waiting for author label and removed the needs triage label on Dec 6, 2023
gunjan-chauhan-dev commented Dec 7, 2023

Thanks, this works as expected after removing the batch processor. Is there any plan for this to also work with the batch processor in the future?
Closing this issue.

crobert-1 commented:

Hello @gunjan-chauhan-dev, the batch processor has configuration options for batching by metadata, which allow the metadata to be kept with the data; see the relevant section in its README. The keys listed in metadata_keys will be included in the metadata of the telemetry that the batch processor creates.
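
A minimal sketch of that option, assuming the tenant header is already exposed as client metadata by the receiver (include_metadata: true) and should be preserved under the same key:

processors:
  batch:
    # Batch per X-Scope-Orgid value so the metadata survives batching
    # and remains available to the headers_setter extension.
    metadata_keys:
      - X-Scope-Orgid
    # Optional: cap the number of distinct metadata combinations
    # (one batcher instance is kept per combination).
    metadata_cardinality_limit: 1000

With this in place, the batch processor can stay in the pipeline instead of being removed.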

gunjan-chauhan-dev commented:

Thanks @crobert-1, this is helpful.
