event emitter memory leak on flush #3529
Comments
By the way, I'm unsure whether flush() just inserts a marker/barrier in the stream, so that whatever stream reader is chained to the one I'm writing to will see the marker and know it's a good time to flush when it hits it. I've been assuming that write() and flush() cooperate roughly in that way. But maybe it's more intricate, and flush() actually causes some processing to happen, so I have to make write() and flush() cooperate manually, for example by holding off further write()s until flush() has signalled completion via a callback. If anyone knows...
@dougwilson, could you detail the core bug a bit more?
You're probably just calling flush too much. Each flush call waits for a drain, and a drain probably never happened. An optimization in node could maybe be knowing when a flush is in progress, so it doesn't continuously add more listeners.
The bug is that every call to flush() adds another 'drain' listener to the underlying zlib stream without removing the previous ones. The history and the fix can be found in nodejs/node-v0.x-archive#25679
Hmm, I'm calling flush() the minimum necessary amount of times, as far as I can tell... (Browser requests data from a starting point in time; server sends all data from that point in time and up to the most recent, after which it will flush() once; then it sends events as they occur in the future, flushing once after each event.)
Could be. Are the listeners necessary?
Is this question meant for the compression module developers?
Fix has landed on master |
I'm using SSE streams to push JSON events to a web browser.
There is a res.flush() after every bundle of SSE data, to make sure the client gets data.
(Every 15 seconds there's also a keep-alive function that sends a ": keep-alive\n" comment over the channel, along with a res.flush(), to keep stateful firewalls from closing the TCP connection while the browser window showing graphs etc. is open.)
Also using require('compression'), since the SSE streams are usually heavy with data in the beginning before then slowing down.
For some reason, res.flush() causes an error from node.
I couldn't find response.flush() documented in the Node.js v4.2.1 documentation, so I might be calling it incorrectly?
As far as I could tell from the compression module documentation, though, I'm using it correctly.
According to the folks over at compression, this is not a bug in the compression library, but in nodejs:
expressjs/compression#58 (comment)
What is your opinion?