Title:
🚨 [Bug] Windows - mamba-ssm installation fails with missing triton dependency and --no-build-isolation failure
Environment:
- OS: Windows 11 Pro
- Python version: 3.11.5
- CUDA version: 12.4
- PyTorch version: 2.5.1+cu121
- Torchvision version: 0.20.1+cu121
- Torchaudio version: 2.5.1+cu121
- Visual Studio: VS 2022 Community 14.39.33519
- CUDA compiler (`nvcc`): 12.4.131
- Conda environment: `MasterThesis`
Description of the Issue:
I am attempting to install `mamba-ssm` on a Windows machine with a correctly configured CUDA and PyTorch environment. However, the installation fails with a missing `triton` dependency and build errors, even when using the `--no-build-isolation` and `--no-deps` flags.
Steps to Reproduce:
- Create a fresh Conda environment:

  ```shell
  conda create -n test_env python=3.11
  conda activate test_env
  ```

- Confirm the CUDA installation:

  ```shell
  nvcc --version  # CUDA 12.4 confirmed
  ```

- Install PyTorch with CUDA support:

  ```shell
  pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121
  ```

- Attempt to install `mamba-ssm` using `--no-build-isolation`:

  ```shell
  pip install mamba-ssm --no-build-isolation --no-deps
  ```

- Clone and install directly from GitHub:

  ```shell
  git clone https://github.com/state-spaces/mamba.git
  cd mamba
  pip install . --no-build-isolation --no-deps
  ```
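As a pre-flight check before the install attempts, I also ran a small stdlib-only script (my own helper, not part of `mamba-ssm`) to see which build tools are discoverable on `PATH`; this is how I noticed `ninja` was absent, matching the `UserWarning` in the log:

```python
# Pre-flight check (my own helper, not part of mamba-ssm): report which
# build tools the compilation step will look for on PATH.
import shutil

def toolchain_report(tools=("nvcc", "cl", "ninja", "git")):
    """Map each required tool name to whether shutil.which can find it."""
    return {tool: shutil.which(tool) is not None for tool in tools}

print(toolchain_report())
```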
Observed Errors:
Error log during installation with `--no-build-isolation`:
```text
Building wheels for collected packages: mamba-ssm
  Building wheel for mamba-ssm (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for mamba-ssm (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [75 lines of output]
      torch.__version__ = 2.5.1+cu121
      running bdist_wheel
      UserWarning: Attempted to use ninja as the BuildExtension backend but we could not find ninja.. Falling back to using the slow distutils backend.
      Guessing wheel URL: https://github.com/state-spaces/mamba/releases/download/v2.2.4/mamba_ssm-2.2.4+cu12torch2.5cxx11abiFALSE-cp311-cp311-win_amd64.whl
      Precompiled wheel not found. Building from source...
      ...
      c1xx: fatal error C1083: Cannot open source file: 'csrc/selective_scan/selective_scan.cpp': No such file or directory
      error: command 'cl.exe' failed with exit code 2
```
Key Issues Noted:
- The `triton` dependency could not be resolved.
- Source files (`csrc/selective_scan/selective_scan.cpp`) are missing from the build.
- The `--no-build-isolation` flag does not prevent the error.
- Potential version mismatch: "The detected CUDA version (12.4) has a minor version mismatch with the version that was used to compile PyTorch (12.1). Most likely this shouldn't be a problem."
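For reference, that warning concerns only the minor version. A rough sketch of the comparison (my own simplification, not PyTorch's actual code) shows why 12.4 against 12.1 is considered harmless:

```python
# Simplified sketch of the CUDA version check behind the warning:
# matching major versions (12.x vs 12.y) are treated as compatible.
def cuda_major_matches(compiled: str, detected: str) -> bool:
    """Compare only the major component of two 'major.minor' version strings."""
    return compiled.split(".")[0] == detected.split(".")[0]

print(cuda_major_matches("12.1", "12.4"))  # True: minor mismatch only
print(cuda_major_matches("11.8", "12.4"))  # False: major mismatch
```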
What I've Tried:
- ✅ Confirmed the CUDA installation and compatibility (`nvcc` 12.4).
- ✅ Verified that `torch` imports and `torch.cuda.is_available()` returns `True`.
- ✅ Installed `triton` manually:

  ```shell
  pip install triton
  pip install triton==2.1.0
  ```

- ✅ Attempted installation from both PyPI and the GitHub repository.
- ✅ Upgraded `setuptools` and `wheel`:

  ```shell
  pip install --upgrade setuptools wheel
  ```
Expected Behavior:
- `pip install mamba-ssm` should build the package without missing source files.
- The `--no-build-isolation` flag should prevent the CPU-only build of PyTorch from being reinstalled.
Possible Causes Identified:
- The `csrc` directory may be missing or improperly referenced in the `pyproject.toml`.
- `triton` might not be fully compatible with Windows or the latest CUDA version.
- The `--no-build-isolation` flag might not be functioning correctly in the `setuptools` configuration.
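To test the first cause, a small script (my own hypothetical helper), run from the root of the cloned `mamba/` checkout, can confirm whether the file `cl.exe` complains about actually exists on disk:

```python
# Hypothetical helper: check whether the source file cl.exe reports as
# missing is actually present under the given source tree.
from pathlib import Path

EXPECTED = ("csrc/selective_scan/selective_scan.cpp",)

def missing_sources(root="."):
    """Return the expected source paths that do not exist under root."""
    return [p for p in EXPECTED if not (Path(root) / p).exists()]

print(missing_sources())  # an empty list means the checkout is complete
```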
Request for Assistance:
- Is the `mamba-ssm` package officially supported on Windows?
- Should I consider downgrading CUDA or PyTorch for compatibility?
- Is there an alternative way to bypass the `triton` dependency for a CPU-only installation?
System Information (`pip freeze`):

```text
filelock==3.13.1
fsspec==2024.2.0
Jinja2==3.1.3
MarkupSafe==2.1.5
mpmath==1.3.0
networkx==3.2.1
numpy==2.2.1
pillow==10.2.0
sympy==1.13.1
torch==2.5.1+cu121
torchaudio==2.5.1+cu121
torchvision==0.20.1+cu121
typing_extensions==4.9.0
```
🙏 Thank you for your time and assistance!
Please let me know if you need any additional information or test results.