[MLU] Fix Llama attention_mask in npu and mlu (#9075)
* fix
DrownFish19 authored Sep 3, 2024
1 parent 4e7fb49 commit 9939f84
Showing 1 changed file with 1 addition and 1 deletion.
paddlenlp/transformers/llama/modeling.py (1 addition, 1 deletion)
@@ -1653,7 +1653,7 @@ def forward(
                 is_casual = True
             else:
                 is_casual = is_casual_mask(attention_mask)
-            if get_env_device() != "npu" or get_env_device() != "mlu":
+            if get_env_device() not in ["npu", "mlu"]:
                 if is_casual and alibi is None:
                     attention_mask = None
         else:
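The deleted condition was a boolean logic bug: get_env_device() != "npu" or get_env_device() != "mlu" is true for every device, because no device string can equal both "npu" and "mlu" at once. As a result the branch that drops the attention mask also ran on npu and mlu, where the dense mask needs to be kept. The replacement check, not in ["npu", "mlu"], only drops the mask on other devices. A minimal sketch of the difference, using hypothetical helper names rather than the PaddleNLP API:

def should_drop_mask_old(device: str) -> bool:
    # Always true: for "npu" the second clause holds, for "mlu" the first does,
    # and for any other device both hold, so the disjunction can never be false.
    return device != "npu" or device != "mlu"

def should_drop_mask_new(device: str) -> bool:
    # True only when the device is neither "npu" nor "mlu".
    return device not in ["npu", "mlu"]

for dev in ["gpu", "npu", "mlu"]:
    print(dev, should_drop_mask_old(dev), should_drop_mask_new(dev))
# gpu True True
# npu True False   (old check wrongly dropped the mask here)
# mlu True False   (and here)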
