Unable to find complete exported spans in OpenSearch Backend #36136

Open
charan906 opened this issue Oct 30, 2024 · 2 comments
Labels
bug (Something isn't working) · exporter/opensearch · needs triage (New item requiring triage)

Comments

@charan906

I have generated 20,000 spans, and they are received by the OpenTelemetry Collector. However, when I check the collector pod logs during export, it gets stuck at a particular span number every time, so I cannot see the complete span data in the backend.

This is the configuration I am using, with otel_version="v0.107.0". It is a customized binary that contains plugins from both the core and contrib repositories.

configuration:

exporters:
  debug:
    verbosity: detailed
  opensearch:
    http:
      endpoint: ${env:SE_SERVER_URLS}
      tls:
        ca_file: ${env:ROOT_CA_CERT}
        cert_file: ${env:CLIENT_CRT}
        key_file: ${env:CLIENT_KEY}
extensions:
  memory_ballast: {}
  health_check:
    endpoint: ${env:MY_POD_IP}:13133
  jaegerremotesampling:
    source:
      reload_interval: 0s
      file: /etc/sampling/samplingstrategies.json
processors:
  batch: {}
  memory_limiter:
    # check_interval is the time between measurements of memory usage.
    check_interval: 5s
    # By default limit_mib is set to 80% of ".Values.resources.limits.memory"
    limit_percentage: 80
    # By default spike_limit_mib is set to 25% of ".Values.resources.limits.memory"
    spike_limit_percentage: 25
  probabilistic_sampler:
    sampling_percentage: 100
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: ${env:MY_POD_IP}:4317
        tls:
          cert_file: ${env:SERVER_CRT}
          key_file: ${env:SERVER_KEY}
      http:
        endpoint: ${env:MY_POD_IP}:4318
        tls:
          cert_file: ${env:SERVER_CRT}
          key_file: ${env:SERVER_KEY}
service:
  extensions:
    - memory_ballast
    - health_check
    - jaegerremotesampling
  pipelines:
    traces:
      exporters:
        - debug
        - opensearch
      processors:
        - memory_limiter
        - batch
        - probabilistic_sampler
      receivers:
        - otlp
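
To narrow down whether spans are being dropped on the collector side or simply not reaching the index, it can help to count the documents OpenSearch has actually indexed and compare the total against the 20,000 spans sent. A minimal sketch with curl, assuming ${env:SE_SERVER_URLS} resolves to a single base URL, the same client certificates are usable from a shell, and the exporter is writing to its default ss4o_traces-default-namespace data stream (an assumption; adjust the index pattern if dataset or namespace were customized):

# Count the documents indexed for traces and compare with the number of spans sent.
# NOTE: the index pattern below assumes the exporter's default dataset/namespace.
curl --cacert "$ROOT_CA_CERT" --cert "$CLIENT_CRT" --key "$CLIENT_KEY" \
  "$SE_SERVER_URLS/ss4o_traces-default-namespace/_count?pretty"

If the count stops short of 20,000 at the same number on every run, the spans are being dropped before they reach OpenSearch rather than merely being delayed by the backend.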

Kubernetes resource specifications:

resources:
  telemetry-collector:
    requests:
      memory: 64Mi
      cpu: 250m
    limits:
      memory: 128Mi
      cpu: 500m

Server startup logs:

2024-10-30T12:15:48.331Z info memorylimiter/memorylimiter.go:151 Using percentage memory limiter {"kind": "processor", "name": "memory_limiter", "pipeline": "traces", "total_memory_mib": 128, "limit_percentage": 80, "spike_limit_percentage": 25}
2024-10-30T12:15:48.331Z info memorylimiter/memorylimiter.go:75 Memory limiter configured {"kind": "processor", "name": "memory_limiter", "pipeline": "traces", "limit_mib": 102, "spike_limit_mib": 32, "check_interval": 5}
2024-10-30T12:15:48.333Z info service@v0.107.0/service.go:195 Starting otelcol-custom... {"Version": "1.0.0", "NumCPU": 8}
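
A note on the numbers above: with a 128Mi container limit, the startup log shows the memory_limiter working with a soft limit of 102 MiB and a 32 MiB spike allowance. If a 20,000-span burst pushes the collector past that, the limiter starts refusing data and the pod can be OOM-killed, either of which would look like the export stalling at a fixed span count. A quick way to check this from the Kubernetes side is sketched below (the label selector is a placeholder, not taken from this deployment):

# Check for restarts and OOMKilled terminations on the collector pod.
# The label selector is hypothetical -- replace it with whatever matches the pod.
kubectl get pods -l app=telemetry-collector \
  -o custom-columns=NAME:.metadata.name,RESTARTS:.status.containerStatuses[0].restartCount,LAST_TERMINATION:.status.containerStatuses[0].lastState.terminated.reason

# Look for memory-related warnings in the collector logs around the time the export stalls.
kubectl logs -l app=telemetry-collector --tail=500 | grep -i memory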

charan906 added the bug label on Oct 30, 2024
@codeboten
Contributor

This appears to possibly be an issue with the opensearch exporter, transferring to the contrib repo where that exporter lives

codeboten transferred this issue from open-telemetry/opentelemetry-collector on Nov 1, 2024
codeboten added the needs triage and exporter/opensearch labels on Nov 1, 2024
Contributor

github-actions bot commented Nov 1, 2024

Pinging code owners for exporter/opensearch: @Aneurysm9 @MitchellGale @MaxKsyunz @YANG-DB. See Adding Labels via Comments if you do not have permissions to add labels yourself.
