Trying to run this standard example:

from vllm import LLM, SamplingParams

# Sample prompts.
prompts = [
    "Hello, my name is",
    "The president of the United States is",
    "The capital of France is",
    "The future of AI is",
]

# Create a sampling params object.
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)

# Create an LLM.
llm = LLM(model="facebook/opt-125m")

# Generate texts from the prompts. The output is a list of RequestOutput objects
# that contain the prompt, generated text, and other information.
outputs = llm.generate(prompts, sampling_params)

# Print the outputs.
for output in outputs:
    prompt = output.prompt
    generated_text = output.outputs[0].text
    print(f"Prompt: {prompt!r}, Generated text: {generated_text!r}")
Running it, or using vllm in any other way, gives this error:
(nvenv) (nvenv) [prince@pc minimal-flask-api]$ python3 test.py
Traceback (most recent call last):
File "/home/prince/Desktop/task/minimal-flask-api/test.py", line 33, in <module>
llm = LLM(model="facebook/opt-125m")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/prince/Desktop/task/minimal-flask-api/nvenv/lib/python3.11/site-packages/vllm/entrypoints/llm.py", line 109, in __init__
self.llm_engine = LLMEngine.from_engine_args(engine_args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/prince/Desktop/task/minimal-flask-api/nvenv/lib/python3.11/site-packages/vllm/engine/llm_engine.py", line 386, in from_engine_args
engine_configs = engine_args.create_engine_configs()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/prince/Desktop/task/minimal-flask-api/nvenv/lib/python3.11/site-packages/vllm/engine/arg_utils.py", line 286, in create_engine_configs
device_config = DeviceConfig(self.device)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/prince/Desktop/task/minimal-flask-api/nvenv/lib/python3.11/site-packages/vllm/config.py", line 496, in __init__
raise RuntimeError("No supported device detected.")
RuntimeError: No supported device detected.
OS: Manjaro
CPU: 11th gen i7, 16 GB RAM
No GPU
The model was not downloaded locally.
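For what it's worth, PyTorch in the same venv also reports no CUDA device, which is presumably what DeviceConfig trips over on this machine (a quick check, assuming the usual torch install that vllm pulled in):

import torch

# On this machine this prints False: there is no CUDA device visible,
# which matches the "No supported device detected" error on a CPU-only box.
print(torch.cuda.is_available())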