
Can't be installed using "uv" due to an issue in the setup script #833

Open
halflings opened this issue Feb 17, 2024 · 7 comments

@halflings

See astral-sh/uv#1582 (comment) for full context, but flash-attn can't be installed because its setup script imports packaging at the top, then lists it as a dependency lower down in the same script:
https://github.com/Dao-AILab/flash-attention/blob/5cdabc2809095b98c311283125c05d222500c8ff/setup.py#L9C6-L9C15

In other words, packaging can't both be used by setup.py and be declared as a dependency by that same script: under build isolation it isn't installed yet when setup.py is imported.
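
A condensed sketch of the conflicting pattern (the exact dependency list here is illustrative, not the file verbatim):

# setup.py, condensed: the 'packaging' import at the top runs before pip/uv
# has installed anything, so it fails inside an isolated build environment.
from setuptools import setup
from packaging.version import parse  # the import that breaks under isolation

setup(
    name="flash_attn",
    version="0.0.0",  # placeholder version for this sketch
    # 'packaging' is declared as a dependency here, but install_requires
    # entries are installed after the build, never before setup.py runs.
    install_requires=["torch", "einops", "packaging", "ninja"],
)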

@tridao
Contributor

tridao commented Feb 17, 2024

I'm not sure how this works; do you have a suggestion?

@Qubitium
Contributor

Perhaps packaging can be moved from install_requires to setup_requires. If a package is only used by setup.py, it should live in setup_requires.

My PR uses this approach to keep psutil out of the final installed package, since it is only needed during setup.

https://github.com/Dao-AILab/flash-attention/pull/832/files
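
A minimal sketch of that split, assuming a classic setuptools setup.py (setup_requires is the legacy pre-PEP 517/518 mechanism, and as discussed below it only takes effect once setup() is actually called, so it can't help with imports at the very top of the file):

# Sketch: build-only dependencies go in setup_requires and are not shipped
# with the installed package; runtime dependencies stay in install_requires.
from setuptools import setup

setup(
    name="flash_attn",
    version="0.0.0",  # placeholder version for this sketch
    setup_requires=["psutil"],  # needed only while setup.py runs
    install_requires=["torch", "einops"],  # needed by the installed package
)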

@Qubitium
Contributor

@tridao If flash-attn does not JIT-compile at runtime but only precompiles during setup, it may be best to move all non-runtime dependencies in setup.py to setup_requires.

@Taytay

Taytay commented Feb 19, 2024

Related to (or almost a duplicate of) #453, I believe.

@Taytay

Taytay commented Feb 19, 2024

I've been investigating this for the past hour or two, and here's what I have learned:

Short version:

The best flash-attn can do is detect that it's running in build-isolated mode and that its required modules (like packaging, torch, etc.) are missing, and give a nice error message directing people to the installation instructions. This isn't really a bug in its setup script, and there isn't a way to solve it with setup configuration alone!

Long version:

(Background: this all comes down to build isolation, which pip enables by default when installing Python modules. flash-attn needs you to pip install with --no-build-isolation primarily so that packages like torch can simply be referenced during setup/compilation. If you don't use --no-build-isolation, setup.py runs in a totally clean environment, devoid of things like setuptools, packaging, and PyTorch. This is fine for most packages, because they can declare all of the modules they need in pyproject.toml, and those get installed before setup.py is run. Cool! However, flash-attn can't just list torch as one of its setup dependencies: that would mean PyTorch gets reinstalled as part of flash-attn's installation process, when flash-attn just wants to reference whatever PyTorch is already installed.)

You could put packaging inside setup_requires, but by the time that takes effect, I believe it's too late: setup.py is already running! You can't list modules there that you need at the very beginning of setup.py (from what I've read, anyway). I experimented with this by creating a pyproject.toml file at the root of the flash-attn folder (which forces pip install -e . to run in isolated mode), and sure enough, whether I have packaging in setup_requires or not, setup.py can't find the packaging module.

You can move stuff like packaging into the pyproject.toml file:

[build-system]
requires = ["setuptools", "wheel", "packaging", "ninja"]
build-backend = "setuptools.build_meta"

But that doesn't work either, because you won't have torch or other large modules available to you at setup time.

So I think the best course of action for flash-attn is to have some code at the top that does something like this:

import sys

try:
    from packaging.version import parse, Version
except ImportError:
    # Failing to import packaging.version almost always means setup.py is
    # running inside an isolated build environment, where nothing from the
    # user's environment is visible.
    print(
        """Could not import the 'packaging' module. You likely forgot to install with the "--no-build-isolation" flag. Follow the installation instructions here: https://github.com/Dao-AILab/flash-attention?tab=readme-ov-file#installation-and-features"""
    )
    sys.exit(1)

There are a few places where you'd need that conditional import or module check: packaging, torch, psutil, and ninja.
But nice checks and error messages like this would get rid of the confusion and common bug reports like #453 (which bit me too!).
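
For instance, a hypothetical guard along these lines (the helper and its wording are illustrative, not flash-attn's actual code) could cover all four modules at once:

# Hypothetical build-time guard: report every module that build isolation
# would hide from setup.py, with a single actionable error message.
import importlib.util
import sys

BUILD_TIME_MODULES = ["packaging", "torch", "psutil", "ninja"]

missing = [name for name in BUILD_TIME_MODULES
           if importlib.util.find_spec(name) is None]
if missing:
    print(
        f"Could not find: {', '.join(missing)}. You are likely building in an "
        "isolated environment; install with --no-build-isolation so setup.py "
        "can see your existing packages. See "
        "https://github.com/Dao-AILab/flash-attention?tab=readme-ov-file#installation-and-features"
    )
    sys.exit(1)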

I wrote up similar findings for uv here when I requested they add support for --no-build-isolation:
astral-sh/uv#1715

@fxmarty

fxmarty commented Mar 28, 2024

It's wild that pypa/pip#8437 is locked.

@tyler-romero

FWIW, a pyproject.toml like the following works for me in uv's project flow (uv lock, uv sync, uv run).

[project]
name = "asdf"
version = "0.1.0"
description = "asdf"
authors = [
    { name = "me", email = "me@gmail.com" }
]
requires-python = "~=3.11"
dependencies = [
    "bitsandbytes~=0.43",
    "datasets~=2.20",
    "devtools>=0.12.2",
    "numpy~=1.26",
    "optuna~=3.6",
    "pandas>=2.2.2",
    "peft~=0.11",
    "pillow~=10.4",
    "pydantic~=2.8",
    "schedulefree>=1.2.6",
    "scikit-learn~=1.5",
    "scipy~=1.14",
    "timm~=1.0",
    "torch==2.3.0",
    "torchvision~=0.18.0",
    "transformers~=4.42",
    "triton~=2.3.0",
    "wandb>=0.17.4",
    "flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.3cxx11abiFALSE-cp311-cp311-linux_x86_64.whl"
]
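
(This sidesteps the build-isolation problem entirely: pointing the dependency at a prebuilt wheel means there is no source build, so flash-attn's setup.py never runs.)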
