Issues: intel-analytics/ipex-llm
IPEX-LLM fails to run the quantized Yuan 2.0 M32 model on Intel ARC
Labels: multi-arc, user issue
#12082 · opened Sep 14, 2024 by jianweimama
vLLM 0.5.4 fails to start in TP + PP mode on 8 ARC GPUs
Labels: multi-arc, user issue
#12081 · opened Sep 14, 2024 by oldmikeyang
LLaMA-33B fails with the vLLM 0.5.4 Docker image on 4 ARC GPUs
Labels: multi-arc, user issue
#12079 · opened Sep 14, 2024 by oldmikeyang
Running the vLLM service benchmark (4x ARC 770) with the Qwen1.5-32B-Chat model failed
Labels: multi-arc, user issue
#11956 · opened Aug 29, 2024 by dukelee111
Support inference of an AWQ INT4 Yi-34B model obtained from QLoRA
Labels: multi-arc, user issue
#11946 · opened Aug 28, 2024 by Fred-cell
Failure to launch codegeex4-all-9b using vLLM
Labels: multi-arc, user issue
#11910 · opened Aug 23, 2024 by YongZhuIntel