
prologue uses more memory per request for static files #106

Closed
ITwrx opened this issue Nov 20, 2020 · 4 comments
ITwrx commented Nov 20, 2020

When serving a static image via an img tag, or especially a video via the HTML5 video tag, prologue uses more and more memory per request. With a video on the page, you can quickly be using 1 GB of RAM.

This also happens with other Nim web frameworks, so it's likely something to do with the underlying HTTP server implementations.

However, this is easily mitigated by putting nginx in front of prologue as a reverse proxy and, specifically, serving the static files directly from nginx via a location block, as demonstrated below.

server {
    server_name prologue-test;

    location ^~ /static/ {
        root /var/www/prologue-test;
        try_files $uri $uri/;
        expires 1y;
        access_log off;
    }

    location / {
        proxy_pass http://localhost:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}

If testing on localhost, you will want to add a line to /etc/hosts for the server_name used in your nginx config, and use an environment variable for the host name in your app's URLs, so that all your static resources use that domain/server name instead of localhost:8080. Otherwise, the requests would bypass nginx and be served by prologue.
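The local-testing setup described above might look like the following sketch. The STATIC_HOST variable name and the demo.jpg path are just illustrative; prologue-test matches the server_name from the nginx config above.

```shell
# /etc/hosts entry mapping the nginx server_name to the loopback
# address (added manually, as root):
#   127.0.0.1   prologue-test

# Point the app's asset URLs at nginx rather than at prologue directly:
export STATIC_HOST="http://prologue-test"

# The app would then prefix its static URLs with STATIC_HOST, e.g.:
echo "<img src=\"${STATIC_HOST}/static/demo.jpg\">"
# prints: <img src="http://prologue-test/static/demo.jpg">
```

With this in place, browser requests for static files resolve to the nginx server_name and hit the location ^~ /static/ block instead of the prologue process.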

Please note that the above nginx config is a minimal example and should not be considered complete for production use: you will need an http block with various performance and security settings, and likely other directives in this server block. The example above only demonstrates the part relevant to this workaround.

jivank (Contributor) commented Dec 3, 2020

Is this also the case when you compile with -d:release -d:usestd?

-d:usestd will use asynchttpserver from the standard library, which may not have this issue.
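For reference, the flags jivank mentions are passed at compile time; a sketch (app.nim is a placeholder source file name, and this requires the Nim toolchain):

```shell
# -d:usestd selects the stdlib asynchttpserver backend in prologue
nim c -d:release -d:usestd app.nim
./app
```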

ITwrx (Author) commented Dec 3, 2020

@jivank I don't remember if I tried a release build, but yes, I tested with -d:usestd, and this problem exists when using asynchttpserver, httpx, and/or httpbeast (with jester), for that matter. It's easy to test by serving a large file, like a video, directly with the framework (without nginx/apache in front) and whichever underlying HTTP server library you want to test. With a video, the RAM usage climbs very fast. You can view it easily with ps_mem.

jivank (Contributor) commented Feb 7, 2021

@ITwrx Please check again with the latest prologue and Nim 1.4.2; I am serving a 7 GB file and I seem to be getting around 10 MB now with -d:usestd.

ITwrx (Author) commented Feb 28, 2021

@jivank thanks for the "heads up".

A couple of things:

  1. This seems to be fixed. Memory usage stabilized at 9.5 MB, as reported by ps_mem, and did not climb, when testing with Nim 1.4.2 and prologue 0.4.4 directly against Nim's asynchttpserver on http://localhost:8080.
  2. When building and running in one step with nim c -d:ssl -d:usestd -r ITwrxPrologueNew.nim --gc:orc --threads:on, it used 255 MB, but didn't climb per page or static resource loaded.
    When building and then running separately, with nim c -d:ssl -d:usestd ITwrxPrologueNew.nim followed by ./ITwrxPrologueNew --gc:orc --threads:on, it uses 9.5 MB according to ps_mem. I don't remember noticing this before, so I'm adding it here in case it helps someone.
  3. It might still be advisable to serve static resources directly with nginx (or another reverse proxy + web server combo) for performance reasons, but you no longer technically have to in order to avoid this memory usage issue.
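A possible explanation for the difference observed in item 2, worth verifying: with nim c -r, anything placed after the source file name is passed as arguments to the compiled program rather than to the compiler, so --gc:orc and --threads:on in that position would not affect compilation at all. To actually compile with them, they would go before the file name, e.g.:

```shell
# compiler options before the source file; program arguments after it
nim c --gc:orc --threads:on -d:ssl -d:usestd ITwrxPrologueNew.nim
./ITwrxPrologueNew
```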

Thanks nim and prologue devs!

@ITwrx ITwrx closed this as completed Feb 28, 2021