Weight mismatch error while training ASTEPC on my custom data #310
Please `pip install pyabsa -U` and see if that fixes it.
I have reinstalled as you said, but the result has not changed. I still get the error.
Also, you wrote in the documentation here that the hidden dim and embed dim can be changed. Should I change `hidden_dim` and `embed_dim` manually if that would solve the problem, and if so, how can I do that?
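For reference, a minimal sketch of overriding those fields, assuming PyABSA v2's ATEPC config manager (the import path and field names `hidden_dim`, `embed_dim`, and `pretrained_bert` are assumptions based on the v2 examples and may differ in your installed version):

```python
# Sketch only: assumes PyABSA v2's config API; verify names against your version.
from pyabsa import AspectTermExtraction as ATEPC

config = ATEPC.ATEPCConfigManager.get_atepc_config_english()
config.hidden_dim = 768   # should match the pretrained backbone's hidden size
config.embed_dim = 768
# If a checkpoint was trained with a different backbone, pointing this at the
# same backbone is what avoids vocabulary-size mismatches in the first place:
config.pretrained_bert = "microsoft/deberta-v3-base"
```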
This is a known issue caused by a breaking change in transformers. Which version of pyabsa are you using?
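To report the installed versions, a small stdlib-only snippet (package names are the PyPI ones):

```python
from importlib import metadata

# Print the installed version of each relevant package, or note its absence.
for pkg in ("pyabsa", "transformers"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "not installed")
```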
When training the ASTEPC model with both my custom and the predefined datasets, I get the error below.
I followed this notebook:
https://github.com/yangheng95/PyABSA/blob/v2/examples-v2/aspect_term_extraction/Aspect_Term_Extraction.ipynb
```
RuntimeError: Error(s) in loading state_dict for FAST_LCF_ATEPC:
size mismatch for bert4global.embeddings.word_embeddings.weight: copying a param with shape torch.Size([251000, 768]) from checkpoint, the shape in current model is torch.Size([128100, 768]).
```
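The two shapes suggest different vocabularies: 128100 matches DeBERTa-v3-base's tokenizer, while 251000 matches the multilingual mDeBERTa-v3-base, so the checkpoint and the freshly built model likely use different pretrained backbones. One generic workaround (a sketch, not PyABSA's own code) is to drop checkpoint tensors whose shapes disagree with the current model and then load the remainder with `strict=False`. The filtering logic, shown here on plain dictionaries with a stand-in `Param` type in place of real tensors:

```python
from collections import namedtuple

def filter_matching_params(checkpoint, model_state):
    """Keep checkpoint entries whose name and shape match the model; report the rest."""
    kept, dropped = {}, []
    for name, tensor in checkpoint.items():
        if name in model_state and tensor.shape == model_state[name].shape:
            kept[name] = tensor
        else:
            dropped.append(name)
    return kept, dropped

# Stand-in for a tensor: only the shape matters for this check.
Param = namedtuple("Param", "shape")

ckpt = {
    "bert4global.embeddings.word_embeddings.weight": Param((251000, 768)),
    "classifier.weight": Param((13, 768)),
}
model = {
    "bert4global.embeddings.word_embeddings.weight": Param((128100, 768)),
    "classifier.weight": Param((13, 768)),
}

kept, dropped = filter_matching_params(ckpt, model)
# The mismatched embedding is dropped; the classifier weight survives.
# With real models: model.load_state_dict(kept, strict=False)
```

Note that a dropped embedding stays randomly initialized, so this only makes sense if you retrain; matching `pretrained_bert` to the checkpoint's backbone is the cleaner fix.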