
[Bug]: Is the data augmentation for distilling BERT into BiLSTM broken? #9816

Open
@aixingxy

Description

Software environment

- paddlepaddle-gpu: 2.6.2
- paddlenlp: 3.0.0b3

Duplicate check

  • I have searched the existing issues

Bug description

https://github.com/PaddlePaddle/PaddleNLP/blob/2f85a64edd4aa9911c94ccb5ce53e83ac41ce22b/slm/examples/model_compression/distill_lstm/data.py#L97
In the for loop at line 94, `data_list` is identical on every iteration: its value is whatever was left over from the last iteration of the for loop at line 85, so the augmentation only ever operates on that final result.
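
For context, here is a minimal sketch of the pattern being described (hypothetical names and a stand-in tokenizer, not the actual `data.py` source): when the augmentation loop consumes `data_list` only after the tokenization loop has finished, instead of inside it, every augmented sample is derived from the last example alone.

```python
import random


def tokenize(sentence):
    # Stand-in tokenizer for illustration only.
    return sentence.split()


def augment_buggy(examples, n_iter=2, p_mask=0.5):
    new_data = []
    for example in examples:      # plays the role of the loop around line 85 of data.py
        data_list = tokenize(example)
    for _ in range(n_iter):       # plays the role of the loop around line 94 of data.py
        # data_list here is whatever the loop above last assigned, so every
        # augmented sample is derived from the final example only.
        masked = ["[MASK]" if random.random() < p_mask else tok for tok in data_list]
        new_data.append(" ".join(masked))
    return new_data


def augment_fixed(examples, n_iter=2, p_mask=0.5):
    new_data = []
    for example in examples:
        data_list = tokenize(example)
        for _ in range(n_iter):   # augmentation now runs once per example
            masked = ["[MASK]" if random.random() < p_mask else tok for tok in data_list]
            new_data.append(" ".join(masked))
    return new_data


if __name__ == "__main__":
    sents = ["the cat sat", "a dog barked"]
    print(augment_buggy(sents))   # every sample comes from "a dog barked"
    print(augment_fixed(sents))   # samples cover both sentences
```

If that reading is correct, the fix would presumably be to rebuild `data_list` inside the per-example loop (or move the augmentation loop into it), but please check against the surrounding code in `data.py`.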

Steps & code to reproduce reliably

None
