
Fail to load the model from ./checkpoints/ATEPC_ENGLISH_CHECKPOINT! #340

khorg0sh opened this issue Jul 23, 2023 · 6 comments

@khorg0sh

I'm using Google Colab. As was suggested, pyabsa was downgraded to 1.16.27.
I get the following error while loading the model:

RuntimeError: Exception: Error(s) in loading state_dict for FAST_LCF_ATEPC:
Unexpected key(s) in state_dict: "bert4global.embeddings.position_ids". Fail to load the model from ./checkpoints/ATEPC_ENGLISH_CHECKPOINT!
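
For reference, the model is being loaded roughly like this (a minimal sketch of the PyABSA 1.x checkpoint API; the exact arguments may differ from my notebook):

from pyabsa import ATEPCCheckpointManager

# Fetch/load the English ATEPC checkpoint; this is the call that raises the error above
aspect_extractor = ATEPCCheckpointManager.get_aspect_extractor(
    checkpoint='english',   # resolves to something like ./checkpoints/ATEPC_ENGLISH_CHECKPOINT
    auto_device=True,       # use GPU if available
)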

Thanks in advance.

@yangheng95 (Owner) commented Jul 23, 2023

freeze transformers==4.29.0
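
Newer transformers releases no longer register the position_ids buffer in BERT embeddings, so the key saved in the older checkpoint shows up as unexpected; pinning the older release avoids that. In Colab that is roughly:

!pip install transformers==4.29.0

(restart the runtime afterwards if transformers was already imported)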

@ImSanjayChintha

I am having the same issue; freezing is not working either, and neither is any pyabsa version. Please help me fix this. @yangheng95

@yangheng95 (Owner)

> I am having the same issue; freezing is not working either, and neither is any pyabsa version. Please help me fix this. @yangheng95

Hi! Please report your problem in detail:
https://github.com/yangheng95/PyABSA/issues/new?assignees=&labels=&projects=&template=bug_report.md&title=

@ImSanjayChintha

The installations below worked for me:

!pip install pyabsa==1.16.24
!pip install transformers==4.29.0
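
After reinstalling, I restarted the Colab runtime and confirmed the versions with a quick check along these lines:

import pyabsa, transformers
print(pyabsa.__version__, transformers.__version__)  # expect 1.16.24 and 4.29.0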

@SupritYoung

I think you need to update your code to be compatible with the newer transformers versions.
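
For example, one way to keep a newer transformers and still load the old checkpoint is to drop the stale key before calling load_state_dict. This is only a rough sketch (the helper name and arguments are placeholders, not part of PyABSA's API):

import torch

def load_old_checkpoint(model, state_dict_path):
    # Checkpoints saved with older transformers still contain the
    # "position_ids" buffer that newer BERT implementations dropped.
    state_dict = torch.load(state_dict_path, map_location="cpu")
    state_dict.pop("bert4global.embeddings.position_ids", None)  # discard the stale key
    model.load_state_dict(state_dict)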

@yangheng95 (Owner)

> I think you need to update your code to be compatible with the newer transformers versions.

Hi, are you using v1.x?
