
How much VRAM is needed to finetune co_dino_5scale_vit_large_coco? #167

Open
hee-dongdong opened this issue Sep 13, 2024 · 1 comment

@hee-dongdong

I have six 4090 GPUs (120 GB of VRAM in total), but when I try to finetune the model I get a "CUDA out of memory" error.
How much VRAM is needed to train the ViT-backbone model? I would also like to know how many GPUs you used to pretrain it.
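For a rough sense of scale (a back-of-envelope sketch, not the repo's measured footprint): standard mixed-precision Adam training keeps about 16 bytes of model and optimizer state per parameter, before counting activations, which usually dominate for detection models with large inputs. The 300M parameter count below is only an illustrative ViT-Large-scale figure, not the exact size of co_dino_5scale_vit_large_coco:

```python
def state_bytes_per_param():
    # Mixed-precision Adam: fp16 params (2) + fp16 grads (2)
    # + fp32 master params (4) + fp32 Adam moments m and v (4 + 4)
    return 2 + 2 + 4 + 4 + 4  # = 16 bytes per parameter

def model_state_gib(num_params):
    # GiB of model/optimizer state only; activations come on top of this
    return num_params * state_bytes_per_param() / 1024**3

# Hypothetical 300M-parameter model (ViT-Large-scale backbone)
print(round(model_state_gib(300e6), 1))  # → 4.5
```

Even when the static state fits easily, per-GPU activation memory at detection-scale resolutions is what typically triggers the OOM.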

@TempleX98
Collaborator

We used 56 A100 80 GB GPUs to pretrain the model. FSDP and DeepSpeed can help reduce training memory consumption; please refer to this issue.
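As an illustration of the DeepSpeed route, ZeRO stage 3 shards parameters, gradients, and optimizer states across GPUs, and optimizer offload moves Adam states to CPU RAM. The config below is a generic ZeRO-3 sketch with placeholder values, not the Co-DETR repo's official settings:

```json
{
  "train_micro_batch_size_per_gpu": 1,
  "gradient_accumulation_steps": 8,
  "fp16": { "enabled": true },
  "zero_optimization": {
    "stage": 3,
    "offload_optimizer": { "device": "cpu" }
  }
}
```

With six 24 GB GPUs, sharding the optimizer state alone (stage 2) may already be enough; stage 3 plus offload trades speed for a much lower per-GPU peak.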
