
Allow batch size 1 by default #241

Closed
Dolyfin opened this issue Jun 5, 2024 · 1 comment

Dolyfin commented Jun 5, 2024

Is your feature request related to a problem? Please describe.
Fine-tuning on an RTX 3060 12GB with 20s audio length. It only fits in VRAM with a batch size of 1. I'm using gradient accumulation of 16 to counteract this, and it now fits with 10.9GB of VRAM usage.

Describe the solution you'd like
Just lower the minimum allowed batch size in finetune.py so a batch size of 1 can be selected by default, as sketched below.
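
For illustration, assuming finetune.py exposes the batch size through a Gradio slider (the widget name, bounds, and defaults below are assumptions, not the project's actual code), the change would look something like this:

```python
import gradio as gr

# Hypothetical sketch: lower the slider's minimum so a batch size of 1
# is selectable out of the box on low-VRAM GPUs. The actual variable
# names and limits in finetune.py may differ.
batch_size = gr.Slider(
    minimum=1,    # was a higher minimum; allow batch size 1
    maximum=512,
    step=1,
    value=4,
    label="Batch size",
    info="If batch size must be small, raise gradient accumulation to keep the effective batch size reasonable.",
)
```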

Additional context
I'm just too lazy to edit the values after every install/update, and this would help more people with less VRAM fine-tune. It would also help to add an explanation in the 'info - batch size' text telling users to turn up gradient accumulation when the batch size is low, for optimal training.
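
For context on why raising gradient accumulation compensates for a small batch size: the optimizer only steps after gradients from several micro-batches are summed, so the effective batch is batch_size × accumulation_steps (here, 1 × 16 ≈ 16). A minimal, illustrative PyTorch sketch of the pattern (not AllTalk's actual training loop; the model, optimizer, and data here are placeholders):

```python
import torch
from torch import nn

# Placeholder model, optimizer, and data for illustration only.
model = nn.Linear(8, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()
accumulation_steps = 16  # batch_size 1 * 16 steps ~ effective batch of 16
data = [(torch.randn(1, 8), torch.randn(1, 1)) for _ in range(64)]

optimizer.zero_grad()
for step, (x, y) in enumerate(data):
    loss = loss_fn(model(x), y)
    # Scale so the accumulated gradient averages over the effective batch.
    (loss / accumulation_steps).backward()
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()  # one weight update per effective batch
        optimizer.zero_grad()
```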


erew123 (Owner) commented Jun 10, 2024

Hi @Dolyfin. Apologies for my very late reply. I was so deep in getting v2 out that I didn't break away from dealing with code and issues.

I've yet to test out the PR I've been sent (#242), but that may resolve some of the issues with VRAM. I'll take your suggestion for an explanation on board, though. I'm guessing it's on Linux that you're having the issue.

I'll make a note of this in the feature requests so that I don't lose track of it.

Thanks

erew123 closed this as completed Jun 10, 2024