1 parent 89a78be, commit 057411c
src/transformers/modeling_longformer.py
@@ -311,7 +311,7 @@ def forward(
         # is index masked or global attention
         is_index_masked = attention_mask < 0
         is_index_global_attn = attention_mask > 0
-        is_global_attn = any(is_index_global_attn.flatten())
+        is_global_attn = is_index_global_attn.flatten().any().item()

         hidden_states = hidden_states.transpose(0, 1)
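
Why the change likely matters, as a minimal sketch (assuming PyTorch; the variable names mirror the diff, the mask values are illustrative): Python's built-in any() iterates the flattened tensor one element at a time, producing a 0-dim tensor per position and, on GPU, a host/device sync per element, whereas tensor.any().item() runs a single reduction kernel followed by one sync.

    import torch

    # Masks follow the convention visible in the diff:
    # attention_mask < 0 means masked, > 0 means global attention.
    # The concrete values below are illustrative, not from the source.
    attention_mask = torch.tensor([[0.0, 10000.0, 0.0],
                                   [0.0, 0.0, -10000.0]])
    is_index_global_attn = attention_mask > 0

    # Before: built-in any() loops over the tensor in Python,
    # checking the truthiness of one 0-dim tensor per element.
    before = any(is_index_global_attn.flatten())

    # After: one tensor-level reduction, then a single .item() call
    # to pull the scalar back to Python as a bool.
    after = is_index_global_attn.flatten().any().item()

    assert before == after == True

Both forms return a Python bool, so downstream control flow is unchanged; the rewritten version simply collapses the per-element work into one reduction.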