
How can I replace the cdm/cdw of the LCFS-BERT model with the cdm/cdw from the DLCF-DCA model? #332

Open
ningmiaokai opened this issue Jul 5, 2023 · 5 comments

Comments

@ningmiaokai

Unless you provide the REQUIRED information, your problem may not be addressed.

PyABSA Version (Required)

See the console output for PyABSA, Torch, Transformers Version

ABSADataset Version (Required if you use integrated datasets)

The dataset used is the Twitter dataset.

Code To Reproduce (Required)


```
NameError                                 Traceback (most recent call last)
/tmp/ipykernel_1084/1610265304.py in <module>
    305
    306 if __name__ == '__main__':
--> 307     main()

/tmp/ipykernel_1084/1610265304.py in main()
    301
    302     ins = Instructor(opt)
--> 303     ins.run()
    304
    305

/tmp/ipykernel_1084/1610265304.py in run(self)
    175
    176         self._reset_params()
--> 177         best_model_path = self._train(criterion, optimizer, train_data_loader, val_data_loader)
    178         self.model.load_state_dict(torch.load(best_model_path))
    179         self.model.eval()

/tmp/ipykernel_1084/1610265304.py in _train(self, criterion, optimizer, train_data_loader, val_data_loader)
    108
    109         inputs = [sample_batched[col].to(self.opt.device) for col in self.opt.inputs_cols]
--> 110         outputs = self.model(inputs)
    111         targets = sample_batched['polarity'].to(self.opt.device)
    112

~/miniconda3/lib/python3.8/site-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
   1100         if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1101                 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1102             return forward_call(*input, **kwargs)
   1103         # Do not call functions when jit is used
   1104         full_backward_hooks, non_full_backward_hooks = [], []

/tmp/ipykernel_1084/3337720987.py in forward(self, inputs, output_attentions)
    164
    165         if self.opt.local_context_focus == 'cdm':
--> 166             masked_local_text_vec = self.get_dynamic_cdm_vec(opt, max_dist, text_local_indices, aspect_indices, aspect_begin, distances)
    167             bert_local_out = torch.mul(bert_local_out, masked_local_text_vec)
    168

NameError: name 'opt' is not defined
```
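The traceback points at the root cause: `forward()` references the bare name `opt`, which is not in scope there; the options object only exists as an instance attribute (`self.opt` in older PyABSA, `self.config` in newer versions). A minimal, self-contained sketch of the bug and the fix, using simplified stand-in names with the torch details omitted (this is not the real PyABSA code):

```python
# Minimal sketch of the scoping bug behind the NameError above: inside
# forward(), the bare name `opt` is undefined -- the options object is
# stored on the model instance (self.opt in older PyABSA, self.config
# in newer releases). Class and method names are illustrative stand-ins.

class DLCFBlock:
    def __init__(self, opt):
        # Store the options so instance methods can reach them later.
        self.opt = opt

    def get_dynamic_cdm_vec(self, opt, text_len):
        # Toy stand-in for the real CDM mask computation.
        return [[1.0] * opt["embed_dim"] for _ in range(text_len)]

    def forward(self, text_len):
        # Buggy line (raises NameError: name 'opt' is not defined):
        #   masked = self.get_dynamic_cdm_vec(opt, text_len)
        # Fixed: read the options object from the instance instead.
        return self.get_dynamic_cdm_vec(self.opt, text_len)


mask = DLCFBlock({"embed_dim": 4}).forward(3)
print(len(mask), len(mask[0]))  # 3 4
```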

Full Console Output (Required)

The full console output, text-only, no screenshots here.

Describe the bug

A clear and concise description of what the bug is.

Expected behavior

I want to take the dynamic local context focus mechanism (cdw/cdm) from DLCF-DCA in PyABSA, use it to replace the cdw/cdm in LCFS-BERT, and then improve it further, but after the replacement it keeps throwing errors. In your previous version, what is now `config` was named `opt` and everything else was identical, yet I get errors whether I use `opt` or `config`.

Screenshots

Below is my modified DLCF-BERT model and the error output:
[Screenshots: 2023-07-05 102340, 102830, 103235, 103432]
The original LCFS-BERT model:
[Screenshot: 2023-07-05 103658]

@ningmiaokai
Author

In single-sentence testing, every other function that takes `config` also fails.
[Screenshots: 2023-07-05 131426, 131604, 133300]

@yangheng95
Owner

yangheng95 commented Jul 6, 2023

Did you mix in code from https://github.com/XuMayi/DLCF-DCA, or are you using an outdated version of PyABSA? The latest code no longer uses `opt` at all.

@yangheng95
Owner

However, it is not as simple as just renaming `opt` to `config`; please use the dlcf model code from this repository.
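The maintainer's point above, that renaming `opt` to `config` alone is not enough, can be illustrated with a hedged toy example: if the stored attribute is renamed but any call site still uses the old name (or vice versa), an `AttributeError` follows. The class and field names below are illustrative, not the actual PyABSA code:

```python
# Toy illustration of why a bare rename is insufficient: every reference
# to the options object must be migrated together, or stale call sites
# break at runtime. Names here are hypothetical stand-ins.

class RenamedBlock:
    def __init__(self, config):
        self.config = config  # newer naming convention

    def forward(self):
        # Stale call site left over from the old naming would fail:
        #   return self.opt["embed_dim"]  # AttributeError: no attribute 'opt'
        # All references must use the new attribute consistently:
        return self.config["embed_dim"]


print(RenamedBlock({"embed_dim": 768}).forward())  # 768
```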

@ningmiaokai
Author

> However, it is not as simple as just renaming `opt` to `config`; please use the dlcf model code from this repository.

Thank you very much. One more question: why do I get no results when I train your DLCF-DCA model directly? Is installing PyABSA required for training? I am on pyabsa==1.1.24 and want to reuse part of your model; does that also require installing the package? Which part of the code should be used for training (to output F1, acc, and similar metrics)? I have been using the apc_trainer.py file.

@yangheng95
Owner

Consider cloning the latest code of this repository and basing all of your modifications on it.
