[Intel GPU] fix xpu not supporting punica kernel (which uses torch.library.custom_op) (vllm-project#7685)
jikunshang authored and omrishiv committed Aug 26, 2024
1 parent a7fb528 commit d03810a
Showing 1 changed file with 3 additions and 1 deletion: vllm/lora/punica.py
```diff
@@ -10,8 +10,10 @@
 import torch
 
 from vllm.triton_utils import HAS_TRITON
+from vllm.utils import is_xpu
 
-if HAS_TRITON:
+# FIXME: xpu path doesn't support torch.library.custom_op
+if HAS_TRITON and not is_xpu():
     from vllm.lora.ops.bgmv_expand import bgmv_expand
     from vllm.lora.ops.bgmv_expand_slice import bgmv_expand_slice
     from vllm.lora.ops.bgmv_shrink import bgmv_shrink
```
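The gist of the change is a platform-capability guard: the Triton-backed punica ops are registered through torch.library.custom_op, which the XPU path does not support, so their import is gated on both Triton availability and the platform check. A minimal sketch of that guard pattern, with the predicate factored out for clarity (the function name below is illustrative, not part of vLLM's API):

```python
def punica_kernels_available(has_triton: bool, is_xpu: bool) -> bool:
    """Mirror the guard introduced by the diff.

    The Triton punica ops are registered via torch.library.custom_op,
    which (per the commit's FIXME) the xpu path does not support, so
    they may only be imported when Triton is present AND the platform
    is not XPU.
    """
    return has_triton and not is_xpu

# Usage: gate the heavyweight kernel imports behind the check, as the
# diff does with `if HAS_TRITON and not is_xpu():`.
if punica_kernels_available(has_triton=True, is_xpu=False):
    pass  # e.g. from vllm.lora.ops.bgmv_expand import bgmv_expand
```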
