Stable Diffusion 3.5 Large CUDA OUT_OF_MEMORY on RTX 3090 #2597
Comments
That seems odd, we made a couple of optimizations to memory usage following #2574, and in the end SD 3.5 large was reported to work well on a GPU with only 20GB of memory. Maybe there are some other processes using the memory?
There are no other processes running. How would you like me to run nsys? Here's some system info:

    $ cat /etc/os-release
    PRETTY_NAME="Ubuntu 24.04.1 LTS"
    $ rustc --version
    rustc 1.81.0 (eeb90cda1 2024-09-04)
    $ cargo --version
    cargo 1.81.0 (2dbb1af80 2024-08-20)
    NVIDIA-SMI 560.35.03    Driver Version: 560.35.03    CUDA Version: 12.6
    ...
Have you tried it with the cudnn feature in addition to the cuda feature? I found it used less VRAM when cudnn was enabled.
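For reference, enabling both features would look something like this (a sketch based on the invocation reported in this issue; `cudnn` is assumed to be the relevant Cargo feature name in candle, and a rebuild is triggered by the feature change):

```shell
# Rebuild and run the example with cuDNN enabled alongside CUDA.
# The feature list is comma-separated; everything after `--` goes to the example itself.
cargo run --example stable-diffusion-3 --release --features=cuda,cudnn -- \
  --which 3.5-large --prompt "pretty picture"
```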
I have not. @LaurentMazare
Not sure how much I would trust the memory usage reported by some external tool (especially here where it seems to only measure memory usage every 10s), it's probably safer to use nsys to get a proper memory profile.
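A minimal way to capture such a profile, assuming Nsight Systems (`nsys`) is installed, might be:

```shell
# Trace CUDA activity and record memory allocations so peak usage shows up
# in the report; the output name "sd35-mem" is arbitrary.
nsys profile --trace=cuda --cuda-memory-usage=true -o sd35-mem \
  cargo run --example stable-diffusion-3 --release --features=cuda -- \
    --which 3.5-large --prompt "pretty picture"

# Summarize the captured report on the command line.
nsys stats sd35-mem.nsys-rep
```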
When I run

    cargo run --example stable-diffusion-3 --release --features=cuda -- --which 3.5-large --prompt "pretty picture"

I get

    Error: DriverError(CUDA_ERROR_OUT_OF_MEMORY, "out of memory")

with Stable Diffusion 3.5 Large and Turbo.
According to this chart from stability.ai, both models should run on an RTX 3090.
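As a rough sanity check on whether 24GB is actually comfortable headroom, here is a back-of-envelope estimate of the weight footprint alone. The parameter counts are assumptions based on Stability AI's published figures (roughly 8.1B for the MMDiT and 4.7B for the T5-XXL text encoder), not values measured from candle:

```python
# Back-of-envelope VRAM estimate for SD 3.5 Large weights in fp16/bf16.
# Parameter counts are approximate figures from Stability AI's announcement.
BYTES_PER_PARAM = 2  # fp16/bf16

mmdit_gib = 8.1e9 * BYTES_PER_PARAM / 1024**3
t5_gib = 4.7e9 * BYTES_PER_PARAM / 1024**3

print(f"MMDiT weights:  {mmdit_gib:.1f} GiB")
print(f"T5-XXL weights: {t5_gib:.1f} GiB")
print(f"Total weights:  {mmdit_gib + t5_gib:.1f} GiB")
```

If both sets of weights are resident at once, the total is already close to the 3090's 24GB before activations and CUDA context overhead, which would make the card sensitive to any extra allocation even if the models nominally fit.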