
Commit

fix lint
simon-mo committed Mar 16, 2024
1 parent cf6ff18 commit ad50bf4
Showing 2 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/ruff.yml
@@ -31,4 +31,4 @@ jobs:
         ruff vllm tests
     - name: Spelling check with codespell
       run: |
-        codespell --toml pyproject.toml
+        codespell --toml pyproject.toml
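The workflow runs codespell with `--toml pyproject.toml`, which tells it to read its configuration from a `[tool.codespell]` table in that file. A minimal sketch of what such a section can look like; the `skip` and `ignore-words-list` values below are illustrative assumptions, not taken from the vLLM repository:

```toml
[tool.codespell]
# Illustrative values only -- the actual vLLM configuration may differ.
ignore-words-list = "dout,te,indicies"
skip = "./build,./dist"
```

Keeping the spell-check configuration in `pyproject.toml` means local runs of `codespell --toml pyproject.toml` match what CI enforces.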
4 changes: 2 additions & 2 deletions tests/kernels/test_prefix_prefill.py
@@ -36,8 +36,8 @@ def test_contexted_kv_attention(
     torch.cuda.manual_seed(0)
     torch.set_default_device(device)

-    # Need this, otherwise when we capture the graph the process for GPU 1 would run on both
-    # GPU0 and GPU1 and things would hang
+    # Need this, otherwise when we capture the graph the process for GPU 1 would
+    # run on both GPU0 and GPU1 and things would hang
    #
    # see also similar issue: https://github.com/Dao-AILab/flash-attention/issues/523
    torch.cuda.set_device(device)
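The comment being re-wrapped above describes why the test pins the process to a single GPU before any CUDA graph capture. A minimal sketch of the idea, assuming PyTorch is installed; `pin_device` is a hypothetical helper name, not part of the test file:

```python
import torch


def pin_device(device: str) -> None:
    """Make `device` the default for new tensors and, for CUDA devices,
    pin the current process to it.

    Without the explicit torch.cuda.set_device call, work intended for
    e.g. "cuda:1" can also initialize GPU 0 and hang during graph
    capture (cf. Dao-AILab/flash-attention issue 523).
    """
    torch.set_default_device(device)
    if device.startswith("cuda") and torch.cuda.is_available():
        torch.cuda.set_device(device)
```

In the test itself the two calls are made inline; wrapping them in a helper only makes the precondition explicit.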
