Modify installation.md for adding pip extra index of torch-npu #1272

Merged 4 commits on Jun 23, 2025
Changes from all commits
docs/source/installation.md: 8 changes (6 additions & 2 deletions)
@@ -123,10 +123,15 @@ apt update -y
apt install -y gcc g++ cmake libnuma-dev wget git
```

-**[Optional]** Config the extra-index of `pip` if you are working on a **x86** machine, so that the torch with cpu could be found:
+**[Optional]** Then configure the extra index of `pip` if you are working on an x86 machine or using a torch-npu dev version, so that the matching torch wheels can be found:

```bash
+# For x86 machines
pip config set global.extra-index-url https://download.pytorch.org/whl/cpu/
+# For the torch-npu dev version
+pip config set global.extra-index-url https://mirrors.huaweicloud.com/ascend/repos/pypi
+# For the torch-npu dev version on x86 machines
+pip config set global.extra-index-url "https://download.pytorch.org/whl/cpu/ https://mirrors.huaweicloud.com/ascend/repos/pypi"
```
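To confirm the setting took effect, `pip config list` prints the stored values; for a repeatable option like `extra-index-url`, pip treats a whitespace-separated value as multiple URLs, which is why the combined form above is quoted. A quick sanity check:

```bash
# Sanity check: print the effective pip configuration.
pip config list
# Expect a line similar to this (the exact value depends on which
# command you ran above):
#   global.extra-index-url='https://download.pytorch.org/whl/cpu/ https://mirrors.huaweicloud.com/ascend/repos/pypi'
```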

Then you can install `vllm` and `vllm-ascend` from **pre-built wheel**:
@@ -156,7 +161,6 @@ cd ..
# Install vLLM Ascend
git clone --depth 1 --branch |vllm_ascend_version| https://github.com/vllm-project/vllm-ascend.git
cd vllm-ascend
-export PIP_EXTRA_INDEX_URL=https://mirrors.huaweicloud.com/ascend/repos/pypi
pip install -v -e .
cd ..
```
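If you prefer not to modify the global pip configuration, the same extra index can be scoped to a single command instead, either through pip's `PIP_EXTRA_INDEX_URL` environment variable (the mechanism removed above) or its `--extra-index-url` flag; a minimal sketch:

```bash
# Scope the extra index to one command instead of setting it globally.
PIP_EXTRA_INDEX_URL="https://mirrors.huaweicloud.com/ascend/repos/pypi" pip install -v -e .
# Equivalently, pass it as a flag:
pip install -v -e . --extra-index-url https://mirrors.huaweicloud.com/ascend/repos/pypi
```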