[Doc] Update docker references (vllm-project#5614)
Signed-off-by: Rafael Vasquez <rafvasq21@gmail.com>
rafvasq authored Jun 19, 2024
1 parent 8cd3ad1 commit 6a27407
Showing 2 changed files with 13 additions and 14 deletions.
20 changes: 10 additions & 10 deletions docs/source/dev/dockerfile/dockerfile.rst
@@ -2,19 +2,19 @@ Dockerfile
====================

See `here <https://github.com/vllm-project/vllm/blob/main/Dockerfile>`_ for the main Dockerfile to construct
-the image for running an OpenAI compatible server with vLLM.
+the image for running an OpenAI compatible server with vLLM. More information about deploying with Docker can be found `here <https://docs.vllm.ai/en/stable/serving/deploying_with_docker.html>`_.

-- Below is a visual representation of the multi-stage Dockerfile. The build graph contains the following nodes:
+Below is a visual representation of the multi-stage Dockerfile. The build graph contains the following nodes:

-- All build stages
-- The default build target (highlighted in grey)
-- External images (with dashed borders)
+- All build stages
+- The default build target (highlighted in grey)
+- External images (with dashed borders)

-The edges of the build graph represent:
-- FROM ... dependencies (with a solid line and a full arrow head)
-- COPY --from=... dependencies (with a dashed line and an empty arrow head)
-- RUN --mount=(.*)from=... dependencies (with a dotted line and an empty diamond arrow head)
+The edges of the build graph represent:
+
+- FROM ... dependencies (with a solid line and a full arrow head)
+- COPY --from=... dependencies (with a dashed line and an empty arrow head)
+- RUN --mount=(.*)from=... dependencies (with a dotted line and an empty diamond arrow head)

.. figure:: ../../assets/dev/dockerfile-stages-dependency.png
:alt: query
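To make the three edge kinds listed in the changed text concrete, here is a minimal multi-stage Dockerfile sketch (generic stage and image names, not an excerpt from the vLLM Dockerfile) containing one instruction of each type:

.. code-block:: dockerfile

   # External image (dashed border in the graph); each FROM creates a solid-line edge.
   FROM ubuntu:22.04 AS base

   FROM base AS build
   COPY . /src
   RUN cd /src && make

   FROM ubuntu:22.04 AS runtime
   # COPY --from=build creates a dashed-line edge back to the build stage.
   COPY --from=build /src/out/app /usr/local/bin/app
   # RUN --mount=...,from=build,... creates a dotted-line edge (requires BuildKit);
   # here it only re-copies the binary to illustrate the dependency type.
   RUN --mount=type=bind,from=build,source=/src/out,target=/out \
       cp /out/app /opt/app-copy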
7 changes: 3 additions & 4 deletions docs/source/serving/deploying_with_docker.rst
@@ -3,9 +3,8 @@
Deploying with Docker
============================

-vLLM offers official docker image for deployment.
-The image can be used to run OpenAI compatible server.
-The image is available on Docker Hub as `vllm/vllm-openai <https://hub.docker.com/r/vllm/vllm-openai/tags>`_.
+vLLM offers an official Docker image for deployment.
+The image can be used to run OpenAI compatible server and is available on Docker Hub as `vllm/vllm-openai <https://hub.docker.com/r/vllm/vllm-openai/tags>`_.

.. code-block:: console
@@ -25,7 +24,7 @@ The image is available on Docker Hub as `vllm/vllm-openai <https://hub.docker.co
memory to share data between processes under the hood, particularly for tensor parallel inference.


-You can build and run vLLM from source via the provided dockerfile. To build vLLM:
+You can build and run vLLM from source via the provided `Dockerfile <https://github.com/vllm-project/vllm/blob/main/Dockerfile>`_. To build vLLM:

.. code-block:: console
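The console blocks in this file are collapsed in the diff above. As a rough illustration of serving the OpenAI-compatible API from the published image (the flags and model name here are examples, not a quote of the collapsed snippet), the invocation looks something like:

.. code-block:: console

   $ docker run --runtime nvidia --gpus all \
       -v ~/.cache/huggingface:/root/.cache/huggingface \
       -p 8000:8000 \
       --ipc=host \
       vllm/vllm-openai:latest \
       --model mistralai/Mistral-7B-v0.1

The ``--ipc=host`` flag relates to the shared-memory note in the hunk above: tensor parallel inference shares data between processes through shared memory, so the container needs access to it (a sufficiently large ``--shm-size`` is the usual alternative).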
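Likewise, a sketch of building the image from source with the linked Dockerfile (the ``vllm-openai`` target and the tag are assumptions here; check the Dockerfile itself for the current stage names and build arguments):

.. code-block:: console

   $ DOCKER_BUILDKIT=1 docker build . --target vllm-openai --tag vllm/vllm-openai:dev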
