I have six RTX 4090 GPUs (120 GB of VRAM in total). However, when I try to fine-tune the model, I get a "CUDA out of memory" error.
How much VRAM is needed to train the ViT backbone? I would also like to know how many GPUs you used when pretraining the model.
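As a rough way to reason about the question, here is a back-of-envelope estimate of the static training memory (weights, gradients, and Adam moments) for a ViT backbone. The parameter count below assumes ViT-L/16 (~307M parameters) purely as an illustration, not the exact model in this repo, and it excludes activations, which usually dominate and scale with batch size and image resolution:

```python
# Hedged sketch: static training-memory estimate for a ViT backbone.
# Assumes fp32 weights with an Adam-style optimizer: weights + gradients
# + two optimizer moment buffers = 4 float copies of the parameters.
# Activation memory is NOT included and typically dominates in practice.

def training_vram_gib(n_params, bytes_per_param=4, copies=4):
    """Memory for weights, grads, and Adam moments, in GiB (activations excluded)."""
    return n_params * bytes_per_param * copies / 1024**3

vit_l_params = 307e6  # ~307M parameters, assumed ViT-L/16 for illustration
print(f"{training_vram_gib(vit_l_params):.1f} GiB before activations")
```

Under plain data parallelism each GPU holds a full replica of this state, so the per-GPU budget (24 GB on a 4090) matters more than the pooled total; gradient checkpointing, mixed precision, or a smaller per-GPU batch are the usual first remedies for the OOM.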