loss curve of llava-next-llama3 #12
Comments
@simplelifetime Could you share your loss curves for both llava-next-vicuna-7b and llava-next-llama3?
+1, thanks
Regarding your question about llama3: I am getting a zero loss value in the fine-tuning stage. Did you also get the same loss values?
{'loss': 1.9166, 'learning_rate': 2.0876826722338203e-08, 'epoch': 0.0}
Following is my code for fine-tuning:
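For anyone who wants to share or compare curves: here is a minimal sketch (not from this repo) that reads the log_history entries the Hugging Face Trainer writes to trainer_state.json, where each training entry looks like the dict above, and plots loss against epoch. The output directory path is a placeholder; adjust it to your own run.

```python
import json
from pathlib import Path

import matplotlib.pyplot as plt

# Placeholder path: point this at your actual Trainer output directory.
state_path = Path("output/llava-next-llama3/trainer_state.json")

with state_path.open() as f:
    state = json.load(f)

# log_history holds dicts like {'loss': 1.9166, 'learning_rate': ..., 'epoch': 0.0};
# evaluation entries carry 'eval_loss' instead of 'loss', so filter them out.
train_logs = [entry for entry in state["log_history"] if "loss" in entry]

epochs = [entry["epoch"] for entry in train_logs]
losses = [entry["loss"] for entry in train_logs]

plt.plot(epochs, losses)
plt.xlabel("epoch")
plt.ylabel("training loss")
plt.title("llava-next-llama3 fine-tuning loss")
plt.savefig("loss_curve.png")
```

Plotting both runs' log_history on the same axes makes it easy to see where the llama3 curve diverges from the vicuna-7b one.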
Thanks for your great work! I'm wondering if you could share the loss curve for training llava-next-llama3? I've observed some behaviour that differs from training llava-next-vicuna-7b, and I'm wondering whether that's normal or whether I made some mistakes during training.