Next steps for the finetuner #13835
WilliamTambellini started this conversation in Ideas
-
Yes, I would review and merge such PRs. Since you are already using the code, what are you doing for quality control?
-
Hi @JohannesGaessler
We have tested the finetuner and managed to finetune some very small LLMs (SLMs?), congrats and thanks.
Here are some ideas for improvements we would need and could work on:
1- add an option to finetune using SGD
2- support bf16 finetuning
3- support finetuning only the last n layers
Would you accept a PR for any of these?
Best
WT
cf @ggerganov