
[exporterhelper] Default queue size is too large #7359

Closed
swiatekm opened this issue Mar 13, 2023 · 4 comments

Comments

@swiatekm (Contributor)

Is your feature request related to a problem? Please describe.
Exporterhelper has a default queue size of 5000. If we also use the batch processor with default settings, that results in a maximum of 5000 * 8192 ≈ 41,000,000 queued spans. At a conservative 500 bytes per span, we'd consume ~20 GB of memory and most likely be killed by either the orchestrator or the OS itself.

This is way too much in my opinion. The default size of this queue should allow enough buffer space to absorb temporary increases in data volume and to feed the 10 consumer threads without issue. It shouldn't try to cover for prolonged remote unavailability; if the user wants that, they should probably use the persistent queue instead.
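
For reference, enabling the persistent queue looks roughly like the sketch below. This assumes the `file_storage` extension from opentelemetry-collector-contrib; the endpoint and directory are placeholders, and key names may differ between versions:

```yaml
extensions:
  file_storage:
    directory: /var/lib/otelcol/queue     # placeholder path for the on-disk queue

exporters:
  otlp:
    endpoint: collector-gateway:4317      # placeholder endpoint
    sending_queue:
      enabled: true
      storage: file_storage               # back the sending queue with the file_storage extension

service:
  extensions: [file_storage]              # the extension must be enabled in the service section
```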

Describe the solution you'd like
I'd like the value to be reduced to the low 100s; both 100 and 200 seem fine given the napkin math above.

Describe alternatives you've considered
If the default stays as is, I think we should include a warning about this in the documentation.

Additional context
This is especially dangerous given that the sending queue is enabled by default for a lot of exporters, including otlpexporter.
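
To make the napkin math concrete, these are the settings involved, and a user can already cap them explicitly in the collector config. A sketch only: the endpoint is a placeholder, the non-default values are arbitrary examples, and the key names reflect the batch processor and exporterhelper configuration at the time, which may change as batching moves into exporterhelper:

```yaml
processors:
  batch:
    send_batch_size: 8192              # default batch size used in the estimate above

exporters:
  otlp:
    endpoint: collector-gateway:4317   # placeholder endpoint
    sending_queue:
      enabled: true                    # enabled by default for otlpexporter
      num_consumers: 10                # default number of consumer threads
      queue_size: 200                  # explicit cap, far below the 5000 default
```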

@swiatekm (Contributor, Author)

From the SIG meeting on 26.04.2023:

  • reduce this a bit more gently, starting with 5000 -> 1000
  • look into reducing the default batch size as well

@mx-psi (Member) commented May 9, 2023

Fixed by #7592

mx-psi closed this as completed May 9, 2023
dmitryax reopened this Jun 20, 2023
@dmitryax (Member)

I believe it's not resolved yet. 1000 was the first step, but we want to reduce it further.

@swiatekm (Contributor, Author)

As per the SIG discussion on 21.06.2023, we'll leave this value alone, as it's likely to be replaced by a different key as we move batching to exporterhelper.
