[Docs] Add notes on ROCm-supported models (vllm-project#2087)
WoosukKwon authored Dec 13, 2023
1 parent 6565d9e commit 096827c
Showing 1 changed file with 10 additions and 3 deletions.
13 changes: 10 additions & 3 deletions docs/source/models/supported_models.rst
@@ -73,6 +73,10 @@ If your model uses one of the above model architectures, you can seamlessly run your model with vLLM.
Otherwise, please refer to :ref:`Adding a New Model <adding_a_new_model>` for instructions on how to implement support for your model.
Alternatively, you can raise an issue on our `GitHub <https://github.com/vllm-project/vllm/issues>`_ project.

.. note::
    Currently, the ROCm version of vLLM does not support Mixtral.
    Additionally, it only supports Mistral for context lengths up to 4096.

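For illustration, a minimal sketch of how you might keep a Mistral model within that limit on the ROCm build. This assumes the :code:`max_model_len` argument of :code:`LLM` is available in your build, and :code:`mistralai/Mistral-7B-v0.1` is used purely as an example model name:

.. code-block:: python

    from vllm import LLM

    # On ROCm, cap the context length at the 4096-token limit noted above.
    llm = LLM(model="mistralai/Mistral-7B-v0.1", max_model_len=4096)
    output = llm.generate("Hello, my name is")
    print(output)
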
.. tip::
    The easiest way to check if your model is supported is to run the program below:

@@ -84,18 +88,21 @@ Alternatively, you can raise an issue on our `GitHub <https://github.com/vllm-project/vllm/issues>`_ project.
        output = llm.generate("Hello, my name is")
        print(output)
    If vLLM successfully generates text, it indicates that your model is supported.
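
(The opening lines of this tip's code block are collapsed in the diff view above. For context, the complete check looks roughly like the sketch below, with :code:`facebook/opt-125m` standing in for the placeholder model name.)

.. code-block:: python

    from vllm import LLM

    # Load the model and run a single generation; if both succeed, the model is supported.
    llm = LLM(model="facebook/opt-125m")  # Name or path of your model
    output = llm.generate("Hello, my name is")
    print(output)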

.. tip::
    To use models from `ModelScope <www.modelscope.cn>`_ instead of HuggingFace Hub, set an environment variable:

    .. code-block:: shell

        $ export VLLM_USE_MODELSCOPE=True
    And pass :code:`trust_remote_code=True` when loading the model:

    .. code-block:: python

        from vllm import LLM

        llm = LLM(model=..., revision=..., trust_remote_code=True)  # Name or path of your model
        output = llm.generate("Hello, my name is")
        print(output)
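
If you prefer to set the variable from Python rather than the shell, a small sketch follows. It assumes vLLM reads :code:`VLLM_USE_MODELSCOPE` when it is first imported (so the variable is set before the import), and :code:`qwen/Qwen-7B-Chat` is used only as an example ModelScope model ID:

.. code-block:: python

    import os

    # Set before importing vLLM; the flag may be read when vLLM is first imported.
    os.environ["VLLM_USE_MODELSCOPE"] = "True"

    from vllm import LLM

    # Example ModelScope model ID; replace with the model you want to run.
    llm = LLM(model="qwen/Qwen-7B-Chat", trust_remote_code=True)
    output = llm.generate("Hello, my name is")
    print(output)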
