
[Bug]: High number of operations after update #1938

Open
julianocosta89 opened this issue Jul 16, 2024 · 5 comments

@julianocosta89
Member

What happened?

Hello all 👋🏽

I was updating the OTel Demo to v0.23 (tracing-opentelemetry is still on 0.23) and noticed that I started getting a bunch of new spans from the Rust service.

Before the update I had 3 operations:

  • oteldemo.ShippingService/ShipOrder
  • oteldemo.ShippingService/GetQuote
  • POST

After the update I got a total of 24 operations, including these new ones:

  • assign_connection_capacity
  • flush
  • FramedRead::decode_frame
  • FramedRead::poll_next
  • FramedWrite::buffer
  • FramedWrite::flush
  • hpack::decode
  • hpack::encode
  • poll
  • poll_ready
  • pop_frame
  • popped
  • Prioritize::queue_frame
  • read_preface
  • recv_stream_window_update
  • reserve_capacity
  • send_data
  • try_assign_capacity
  • try_reclaim_frame
  • updating connection flow
  • updating stream flow

All those new operations seem to be coming from /usr/local/cargo/registry/src/index.crates.io-*/src/proto/streams/prioritize.rs.

Is that expected behavior?
Can we suppress those spans?

Also, the number of events per span has increased significantly.

If you would like to take a look at the code, here is the PR: open-telemetry/opentelemetry-demo#1672. To run it, execute the following commands from the root folder:

docker compose build shippingservice
docker compose up

API Version

0.23

SDK Version

0.23

What Exporter(s) are you seeing the problem on?

OTLP

Relevant log output

No response

@julianocosta89 julianocosta89 added the bug (Something isn't working) and triage:todo (Needs to be triaged) labels Jul 16, 2024
@julianocosta89 julianocosta89 changed the title from "[Bug]:" to "[Bug]: High number of operations after update" Jul 16, 2024
@cijothomas cijothomas self-assigned this Jul 16, 2024
@cijothomas
Member

@julianocosta89 Thanks for reporting this. Unfortunately, the combination of using tracing to create spans and then bridging them to OTel via tracing-opentelemetry is not tested in this repo. It is part of the parent issue tracked in #1571.

Most likely you can use filters from tracing itself to suppress these extra spans from ever being created and sent to OTel. Sorry we don't have a better answer when it comes to the interaction between tracing, tracing-opentelemetry, and opentelemetry, as the parent issue is unresolved.
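
For illustration, a minimal sketch of such a filter using tracing-subscriber's per-layer filtering could look like the code below. The "h2" target is an assumption based on the prioritize.rs path mentioned above, and init_tracing/otel_layer are placeholder names, not code from the demo:

```rust
// Sketch only: drop spans emitted under the `h2` target (assumed to be the
// source of the prioritize.rs spans) before they reach the OTel layer.
use tracing_subscriber::filter::{LevelFilter, Targets};
use tracing_subscriber::{layer::SubscriberExt, util::SubscriberInitExt, Layer, Registry};

// `otel_layer` stands in for the tracing-opentelemetry layer the service
// already builds (e.g. tracing_opentelemetry::layer().with_tracer(tracer)).
fn init_tracing<L>(otel_layer: L)
where
    L: Layer<Registry> + Send + Sync + 'static,
{
    // Keep the service's own spans at INFO, but turn the noisy HTTP/2
    // internals off entirely so they are never exported.
    let filter = Targets::new()
        .with_default(LevelFilter::INFO)
        .with_target("h2", LevelFilter::OFF);

    Registry::default()
        .with(otel_layer.with_filter(filter))
        .init();
}
```

A global directive (e.g. an EnvFilter such as `info,h2=off`) would have a similar effect, but it drops those spans for every layer rather than just the OTel one.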

@julianocosta89
Member Author

hmmm 😢

@cijothomas is there any instrumentation library that is more closely connected to the OTel SIG?
In the demo we use reqwest to make an HTTP call to a downstream service, but I could replace it with a library that is more involved with OTel.

That way we would continue to showcase manual instrumentation on the server side and instrumentation libraries on the client side. WDYT?

@cijothomas
Member

is there any instrumentation library that is more connected to the OTel SIG?

Neither the OTel Rust main repo (this one) nor the OTel Rust contrib repo has any instrumentation libraries! The ones used by the demo project are maintained outside the OTel orgs, and, more importantly, they are instrumented using tokio-tracing, not OpenTelemetry. The right fix for that depends on #1571 itself.

Examples in this repo only show manual instrumentation with OTel's own tracing API. (Although it is possible that this will be considered wrong/deprecated, depending on the outcome of #1571.)
https://github.com/open-telemetry/opentelemetry-rust/tree/main/examples
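
For reference, that style looks roughly like the sketch below: a span created directly with the OpenTelemetry API around an outgoing reqwest call. The span name, attribute, and URL are illustrative, not taken from the demo:

```rust
// Sketch only: manual instrumentation with the OpenTelemetry tracing API
// (no tokio-tracing bridge involved) around an outgoing reqwest call.
use opentelemetry::global;
use opentelemetry::trace::{Span, Tracer};
use opentelemetry::KeyValue;

async fn get_quote(client: &reqwest::Client) -> reqwest::Result<String> {
    let tracer = global::tracer("shippingservice");
    let mut span = tracer.start("get-quote");

    // Hypothetical downstream endpoint, used only for illustration.
    let resp = client.post("http://quote:8080/getquote").send().await?;

    span.set_attribute(KeyValue::new(
        "http.response.status_code",
        resp.status().as_u16() as i64,
    ));
    span.end();

    resp.text().await
}
```

Context propagation and error recording are omitted to keep the sketch short.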

It is an unfortunate situation, and we are aware of it. @TommyCpp to share any additional recommendations.

@julianocosta89
Member Author

Is there any way tokio-tracing would merge into OTel?
We've seen that competing standards are not beneficial to the community, and having everything under OTel would be great from my perspective as a user; I'm not sure what others think.

@jtescher is one of the top contributors to tokio-tracing, and I see he is also part of the OTel group; maybe he could share his opinion here as well.

Tokio would continue to be its own thing, and tracing would be "donated" to the OTel community.

Having OTel's semantic conventions and specifications in multiple languages is one of the greatest achievements and benefits of OTel. I'd love that for Rust as well.

@cijothomas
Member

Is there any way tokio-tracing would merge into OTel?

No. Pasting below the answer to that question from the maintainer of tokio-tracing, extracted from #1689 (comment):

I don't believe so/unsure. To be frank, I don't see a future in which tracing would move under the CNCF: if it's ever to move from the tokio organization, it would be the rust-lang org, and that's a very big if.

@cijothomas cijothomas removed the bug (Something isn't working) and triage:todo (Needs to be triaged) labels Jul 18, 2024