
Conversation

@tjhunter
Collaborator

Closes #57

This PR ensures that the build environment is (mostly) reproducible:

  • saved the version of flash-attn
  • pinned the versions of torch, CUDA, ...

There should be no need for extra setup of flash-attn. I removed all caching and virtual environments, and uv downloaded the right dependencies.
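For context, a pinned dependency set of this kind might look like the following `pyproject.toml` sketch. The version numbers here are illustrative assumptions, not the ones committed in this PR:

```toml
# Hypothetical pyproject.toml fragment; the pinned versions below are
# placeholders for illustration, not taken from this PR.
[project]
name = "example-project"
requires-python = ">=3.11"
dependencies = [
    "torch==2.5.1",       # pinned (version is an assumption)
    "flash-attn==2.7.3",  # pinned (version is an assumption)
]
```

With exact pins like these, `uv` resolves the same package versions on every machine, which is what makes the environment reproducible across clusters.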

@clessig
Collaborator

clessig commented Mar 14, 2025

I think we should then also update here:

%>uv pip install flash_attn --no-build-isolation

@clessig clessig self-requested a review March 14, 2025 13:57
@tjhunter
Collaborator Author

@clessig thank you for pointing out the install.md! Merging now.

@tjhunter tjhunter merged commit f218fa1 into develop Mar 14, 2025
3 checks passed

Successfully merging this pull request may close these issues.

Reproducible instructions to install flash attention on all working clusters

3 participants