
[exporter/elasticsearchexporter] transaction duration missing in traces received in Elasticsearch using elasticsearchexporter #14538

Closed
ramdaspotale opened this issue Sep 27, 2022 · 4 comments


ramdaspotale commented Sep 27, 2022

What happened?

Description

I am using the latest OpenTelemetry Collector with the Elasticsearch exporter to export traces to an Elasticsearch instance. I previously used the otlphttp exporter to send traces to Elastic APM, which then forwarded them to Elasticsearch. Although the Elasticsearch exporter is able to push traces to Elasticsearch, the exported documents do not contain "transaction.duration" (or any trace duration for the time the operation was running in the application), whereas the documents received via Elastic APM did include it. Below are the collector configuration I am using, a sample trace received in Elasticsearch via Elastic APM (otlphttp exporter), and the same trace exported directly to Elasticsearch (elasticsearch exporter).

Steps to Reproduce

Run OpenTelemetry Collector 0.60.0 with the configuration given below and send traces to an Elasticsearch instance using the elasticsearch exporter.

Expected Result

I should get traces that include the transaction duration (or trace duration), which is currently missing. The document below, which contains transaction.duration.us, can be considered the expected result.

Trace exported via the otlphttp exporter to Elastic APM and then to Elasticsearch:
"_index": ".ds-traces-apm-default-2022.08.29-000001",
"_id": "vTdLfoMBWrBaGggf-z_9",
"_version": 1,
"_score": 0,
"_source": {
"agent": {
"name": "otlp",
"version": "unknown"
},
"data_stream.namespace": "default",
"data_stream.type": "traces",
"processor": {
"name": "transaction",
"event": "transaction"
},
"labels": {
"bmap_plan_name": "hj-otel3",
"session": "0x34d0f2f6",
"datasource": "pwdsn_db0",
"dbAction": "ExecuteSelect",
"bmap_plan_id": "63075731126d24faff9b56ca",
"sqlCmd": "select INDEXDOCINFOID,IMPORT from EC_INDEXDOCINFO where (DOC_GUID not in (select o_docguid from dms_doc)) order by IMPORT"
},
"observer": {
"hostname": "elastic-apm-844b55bb54-8wvrt",
"id": "d3a9b4fe-2d05-4a1d-9644-ac40f8529a36",
"ephemeral_id": "8ff0ddec-ee11-4f03-9a1b-3f3c18cf425f",
"type": "apm-server",
"version": "8.3.3"
},
"trace": {
"id": "44b3c2c06e15d444a770b87daab45c0a"
},
"@timestamp": "2022-09-27T09:34:05.445Z",
"ecs": {
"version": "1.12.0"
},
"service": {
"node": {
"name": "7c7ab9aa-341b-4283-ba48-66e20ce65174"
},
"framework": {
"name": "PWDI"
},
"name": "PWDI Server",
"language": {
"name": "unknown"
}
},
"data_stream.dataset": "apm",
"event": {
"agent_id_status": "missing",
"ingested": "2022-09-27T09:34:12Z",
"outcome": "unknown"
},
"transaction": {
"duration": {
"us": 9167
},
"name": "PwDbOperation",
"id": "c69db0111713fa42",
"type": "unknown",
"sampled": true
},
"timestamp": {
"us": 1664271245445453
}
}

Actual Result

This is the trace I receive when it is exported via the Elasticsearch exporter directly to the Elasticsearch instance:
"_index": "traces.projectwise",
"_id": "rzdMfoMBWrBaGggfHUra",
"_version": 1,
"_score": 0,
"_source": {
"@timestamp": "2022-09-27T09:34:05.445453700Z",
"Attributes.bmap.plan_id": "63075731126d24faff9b56ca",
"Attributes.bmap.plan_name": "hj-otel3",
"Attributes.datasource": "pwdsn_db0",
"Attributes.dbAction": "ExecuteSelect",
"Attributes.session": "0x34d0f2f6",
"Attributes.sqlCmd": "select INDEXDOCINFOID,IMPORT from EC_INDEXDOCINFO where (DOC_GUID not in (select o_docguid from dms_doc)) order by IMPORT",
"EndTimestamp": "2022-09-27T09:34:05.454621600Z",
"Kind": "SPAN_KIND_SERVER",
"Link": "[]",
"Name": "PwDbOperation",
"Resource.service.instance.id": "7c7ab9aa-341b-4283-ba48-66e20ce65174",
"Resource.service.name": "PWDI Server",
"SpanId": "c69db0111713fa42",
"TraceId": "44b3c2c06e15d444a770b87daab45c0a",
"TraceStatus": 0
}

Collector version

0.60.0

Environment information

Environment

Collector image: docker.io/otel/opentelemetry-collector-contrib:0.60.0
Elasticsearch: 8.3.3
Elastic APM: 8.3.3
Platform: Azure Kubernetes Service

OpenTelemetry Collector configuration

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318
processors:
  memory_limiter:
    check_interval: 1s
    limit_percentage: 50
    spike_limit_percentage: 30
  batch:
    timeout: 10s
    send_batch_size: 10000
    send_batch_max_size: 11000
  attributes:
    actions:
      - key: bmap.plan_id
        value: 63075731126d24faff9b56ca
        action: insert
      - key: bmap.plan_name
        value: hj-otel3
        action: insert
extensions:
  health_check:
  memory_ballast:
    size_in_percentage: 30
  pprof:
    endpoint: :1888
  zpages:
    endpoint: :55679
exporters:
  logging:
    loglevel: debug
  otlphttp/insecure_no_verify:
    endpoint: https://apm.<REDACTED>.com:8200
    compression: none
    tls:
      insecure: false
      insecure_skip_verify: true
  elasticsearch/trace:
    endpoints: [https://elastic.<REDACTED>.com:9200]
    traces_index: traces.projectwise
    api_key: <REDACTED>
    tls:
      insecure: false
      insecure_skip_verify: true
service:
  extensions: [pprof, zpages, health_check, memory_ballast]
  pipelines:
    traces:
      receivers: [otlp]
      processors: [memory_limiter, batch, attributes]
      exporters: [logging, otlphttp/insecure_no_verify, elasticsearch/trace]

Log output

No response

Additional context

No response

ramdaspotale added the bug and needs triage labels on Sep 27, 2022
jpkrohling added the exporter/elasticsearch label and removed the needs triage label on Sep 27, 2022
github-actions bot commented:

Pinging code owners: @urso @faec @blakerouse. See Adding Labels via Comments if you do not have permissions to add labels yourself.

JaredTan95 (Member) commented:

@ramdaspotale As far as I know, an OTLP span does not contain a transaction.duration attribute. In your case, Elastic APM received the OTLP traces and calculated the duration from the span's start and end timestamps.
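
For reference, the duration is recoverable from what the exporter already writes: the difference between @timestamp (09:34:05.445453700Z) and EndTimestamp (09:34:05.454621600Z) in the exported document is about 9167.9 µs, which lines up with the transaction.duration.us of 9167 that Elastic APM computed. Below is a minimal sketch of an ingest pipeline that derives such a field on the Elasticsearch side; the pipeline name span-duration and the Duration field are made up for illustration, and the Painless details may need adjusting for your Elasticsearch version:

PUT _ingest/pipeline/span-duration
{
  "description": "Illustrative sketch: derive a Duration field (microseconds) from the span start (@timestamp) and end (EndTimestamp) timestamps; pipeline and field names are hypothetical",
  "processors": [
    {
      "script": {
        "lang": "painless",
        "source": "ZonedDateTime start = ZonedDateTime.parse(ctx['@timestamp']); ZonedDateTime end = ZonedDateTime.parse(ctx['EndTimestamp']); ctx['Duration'] = ChronoUnit.MICROS.between(start, end);"
      }
    }
  ]
}

If the elasticsearch exporter version you run exposes a pipeline setting, the exported documents could be routed through such a pipeline; otherwise a runtime field on the traces.projectwise index can compute the same difference at query time.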

github-actions bot commented:

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the Stale label on Nov 30, 2022
github-actions bot commented:

This issue has been closed as inactive because it has been stale for 120 days with no activity.

github-actions bot closed this as not planned (stale) on May 26, 2023
mx-psi pushed a commit that referenced this issue Jul 5, 2023
**Description:**
#14538

---------

Signed-off-by: Jared Tan <jian.tan@daocloud.io>