
Commit a5d87a1

re-enable avoid torch slice fix when chunked prefill is disabled (ope…
sanyalington authored Sep 26, 2024
1 parent cc2039c commit a5d87a1
Showing 1 changed file with 1 addition and 1 deletion.
vllm/attention/backends/rocm_flash_attn.py (1 addition & 1 deletion)
@@ -573,7 +573,7 @@ def forward(
         else:
             out = output
         ops.paged_attention_rocm(
-            output[num_prefill_tokens:],
+            out,
             exp_sums,
             max_logits,
             tmp_output,
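
For context, a minimal sketch of the pattern this one-line change restores: when chunked prefill is disabled there are no prefill tokens in the batch, so the precomputed `out` (which is just `output` itself in that case) can be passed to the kernel instead of slicing `output[num_prefill_tokens:]` on every call. Variable names mirror the surrounding forward() code; the tensor shape and values here are illustrative assumptions, not the actual vLLM code.

```python
import torch

# Hypothetical values standing in for the tensors inside forward().
num_prefill_tokens = 0          # 0 when chunked prefill is disabled
output = torch.empty(8, 128)    # attention output buffer for the batch

if num_prefill_tokens > 0:
    # Mixed prefill/decode batch: the decode kernel must write only the
    # tail of the buffer, so a slice (a view) is needed.
    out = output[num_prefill_tokens:]
else:
    # Decode-only batch: reuse the whole buffer and skip the slice op.
    out = output

# `out` is what the diff now passes to ops.paged_attention_rocm(...)
# in place of slicing output[num_prefill_tokens:] at the call site.
assert out.data_ptr() == output.data_ptr()  # same storage, no copy
```
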
