Environment about flash attention #5

shenqiaos

First of all, thank you for your excellent work in this area.
When I tried to run inference with the model, I got the following error:

 File "/mnt/HDD/home_combine/qwx/Prompt4Driving/projects/mmdet3d_plugin/models/utils/attention.py", line 16, in <module>
    from flash_attn.flash_attn_interface import flash_attn_unpadded_kvpacked_func
ModuleNotFoundError: No module named 'flash_attn'

flash_attn requires CUDA >= 11.7, but this project uses CUDA 11.1, so installing it via pip fails:
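As a side note, a common way to keep such code importable when flash_attn is missing is an import guard. This is only a sketch of a possible workaround, not part of the Prompt4Driving code; the `HAS_FLASH_ATTN` flag is a hypothetical name:

```python
# Hypothetical workaround (not from the repository): guard the flash_attn
# import so the module still loads when the package is unavailable,
# e.g. on CUDA < 11.7 environments. Callers can check HAS_FLASH_ATTN
# and fall back to a standard attention implementation.
try:
    from flash_attn.flash_attn_interface import flash_attn_unpadded_kvpacked_func
    HAS_FLASH_ATTN = True
except ImportError:
    flash_attn_unpadded_kvpacked_func = None
    HAS_FLASH_ATTN = False
```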

Collecting flash_attn
  Using cached flash_attn-2.7.0.post2.tar.gz (2.7 MB)
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error
  
  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [12 lines of output]
      fatal: not a git repository (or any of the parent directories): .git
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/tmp/pip-install-r4uz0u_i/flash-attn_32a89c6d2cd64a458e47c873c4dc9a8a/setup.py", line 164, in <module>
          raise RuntimeError(
      RuntimeError: FlashAttention is only supported on CUDA 11.7 and above.  Note: make sure nvcc has a supported version by running nvcc -V.
      
      
      torch.__version__  = 1.9.0+cu111
      
      
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

Could you tell me how to deal with this problem?

wudongming97 (Owner) commented on Nov 30, 2024

We used a very early version of flash-attention:
https://github.com/Dao-AILab/flash-attention/tree/v0.2.2

You can try to install it locally.
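A local install of that tag might look like the following. This is a sketch, assuming `git`, `pip`, and a working `nvcc` are on the PATH and that PyTorch is already installed for your CUDA version:

```shell
# Sketch: build flash-attention v0.2.2 from source locally.
# Assumes git, pip, and a CUDA-matched PyTorch are available.
git clone https://github.com/Dao-AILab/flash-attention.git
cd flash-attention
git checkout v0.2.2
pip install .
```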

shenqiaos (Author) commented on Nov 30, 2024

Thank you for your prompt reply. I will try it out.
