Memory leak when using more than one MetricReader #4115
Comments
I could not, for the life of me, remember why I never had this issue before, so I had a look at some old code I wrote a few years back that also implemented OTel. Back then I had written my own MultiSpanExporter and MultiMetricExporter classes, so I was never registering multiple exporters; I simply handled exporting to multiple exporters in a wrapper class.
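A minimal sketch of what such a wrapper class might look like, assuming the public `PushMetricExporter` interface from `@opentelemetry/sdk-metrics` (the class name and details here are illustrative, not the commenter's actual code):

```typescript
import { ExportResult, ExportResultCode } from '@opentelemetry/core';
import { PushMetricExporter, ResourceMetrics } from '@opentelemetry/sdk-metrics';

// Hypothetical fan-out exporter: one MetricReader drives several backends,
// so only a single reader is ever registered with the MeterProvider.
export class MultiMetricExporter implements PushMetricExporter {
  constructor(private readonly exporters: PushMetricExporter[]) {}

  export(
    metrics: ResourceMetrics,
    resultCallback: (result: ExportResult) => void
  ): void {
    let pending = this.exporters.length;
    let failed = false;
    if (pending === 0) {
      resultCallback({ code: ExportResultCode.SUCCESS });
      return;
    }
    // Forward the same batch to every wrapped exporter and report
    // success only once all of them have called back.
    for (const exporter of this.exporters) {
      exporter.export(metrics, result => {
        if (result.code !== ExportResultCode.SUCCESS) failed = true;
        if (--pending === 0) {
          resultCallback({
            code: failed ? ExportResultCode.FAILED : ExportResultCode.SUCCESS,
          });
        }
      });
    }
  }

  async forceFlush(): Promise<void> {
    await Promise.all(this.exporters.map(e => e.forceFlush()));
  }

  async shutdown(): Promise<void> {
    await Promise.all(this.exporters.map(e => e.shutdown()));
  }
}
```

With this shape, the MeterProvider only ever sees one MetricReader, which is the condition under which memory stays flat in the runs described in this issue.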
I created a minimal reproducer repository for it. The memory leak looks to be in the metric collection/export pipeline; memory leaks faster when the collection interval is shorter.
Self-assigning this because I think I've finally gotten to the bottom of this. It looks like we're keeping track of unreported accumulations for each registered MetricReader. I've tried just collecting with one MetricReader, and the leak goes away. I'll see to have a PR ready for this by the end of this week/early next week.
For anyone affected by this, here is a very crude workaround ...
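The workaround itself is elided above; as one possible alternative in the spirit of the wrapper-class approach from the earlier comment (a sketch on my part, with illustrative names and exporter choices), register a single PeriodicExportingMetricReader and fan out inside the exporter:

```typescript
import {
  ConsoleMetricExporter,
  MeterProvider,
  PeriodicExportingMetricReader,
} from '@opentelemetry/sdk-metrics';
import { OTLPMetricExporter } from '@opentelemetry/exporter-metrics-otlp-http';

// Single reader, fanning out to two backends via the MultiMetricExporter
// sketched in the earlier comment (assumed to be in scope here).
const provider = new MeterProvider();
provider.addMetricReader(
  new PeriodicExportingMetricReader({
    exporter: new MultiMetricExporter([
      new OTLPMetricExporter(),    // e.g. first collector
      new ConsoleMetricExporter(), // e.g. second destination
    ]),
    exportIntervalMillis: 10_000,
  })
);
```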
What happened?
Steps to Reproduce
Add two MetricReaders to a MeterProvider (two addMetricReader calls, as sketched below) and watch memory usage explode.
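A minimal sketch of such a setup (exporter choice and names are illustrative; this is not the reporter's actual setup code):

```typescript
import {
  ConsoleMetricExporter,
  MeterProvider,
  PeriodicExportingMetricReader,
} from '@opentelemetry/sdk-metrics';

// Two readers on the same provider; a short interval makes the leak show faster.
const provider = new MeterProvider();
provider.addMetricReader(
  new PeriodicExportingMetricReader({
    exporter: new ConsoleMetricExporter(),
    exportIntervalMillis: 1000,
  })
);
provider.addMetricReader(
  new PeriodicExportingMetricReader({
    exporter: new ConsoleMetricExporter(),
    exportIntervalMillis: 1000,
  })
);

// Record on a counter so there is something to collect each cycle.
const counter = provider.getMeter('leak-repro').createCounter('leak_counter');
setInterval(() => counter.add(1), 10);
```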
Expected Result
A reasonable amount of memory usage, preferably the same as with only one MetricReader.
Actual Result
A very fast growth of RAM usage.
Additional Details
Using @opentelemetry/api 1.4.1, I began seeing massive spikes in memory usage, but only on some systems. After some digging around, I found it happened only on systems exporting to more than one OpenTelemetry collector.
That had not been an issue before, so I downgraded to older package versions; that cut the memory usage in half, but it was still leaking memory.
The graph below is memory usage for three runs:
- First run: newest package versions
- Second run: older package versions
- Last run: only one MetricReader
While gathering data and graphs for this issue, I had the idea of removing the AsyncHooksContextManager, since I'm not really using it right now.
I tried adding two MetricReaders and updating to the latest version of the packages, but without the AsyncHooksContextManager. That reduced memory usage drastically, but memory still slowly creeps up over time, so it did not solve the issue (here compared with the graphs above).
The project this code runs in is super basic: it starts one or two new processes and restarts them if they die, so it should not really be using any RAM or CPU.
OpenTelemetry Setup Code
package.json
Relevant log output
No response