
Conversation

@LucasWilkinson LucasWilkinson commented Sep 24, 2025

@mergify mergify bot added the ci/build label Sep 24, 2025

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request updates the vllm-flash-attention dependency to a newer commit to pick up several bug fixes. The change is straightforward and correct. I've added one suggestion to improve maintainability by documenting the reason for pinning this specific commit hash.

  vllm-flash-attn
  GIT_REPOSITORY https://github.com/vllm-project/flash-attention.git
- GIT_TAG ee4d25bd84e0cbc7e0b9b9685085fd5db2dcb62a
+ GIT_TAG 4695e6bed5366c41e28c06cd86170166e4f43d00
@gemini-code-assist gemini-code-assist bot commented (priority: high)

For better maintainability, it's good practice to add a comment explaining what this Git commit hash corresponds to. This helps other developers understand why this specific commit is being used without having to look it up. You could mention the PRs this commit includes, as you did in the pull request description.

          # Includes fixes for FA3 build on CUDA 12.3, 12.4, and 12.5.
          # See https://github.com/vllm-project/flash-attention/pull/94
          GIT_TAG 4695e6bed5366c41e28c06cd86170166e4f43d00
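
For reference, below is a minimal sketch of how such a pinned dependency might be declared with CMake's FetchContent module. The vllm-flash-attn name, repository URL, and commit hash come from the diff above, and the explanatory comment mirrors the bot's suggestion; the include/FetchContent_Declare/FetchContent_MakeAvailable scaffolding is an assumption for illustration and may differ from how vLLM's build actually wires this up.

    include(FetchContent)

    FetchContent_Declare(
      vllm-flash-attn
      GIT_REPOSITORY https://github.com/vllm-project/flash-attention.git
      # Pin to an exact commit for reproducible builds.
      # Includes fixes for FA3 build on CUDA 12.3, 12.4, and 12.5.
      # See https://github.com/vllm-project/flash-attention/pull/94
      GIT_TAG 4695e6bed5366c41e28c06cd86170166e4f43d00
    )

    # Downloads the pinned commit and adds its targets to the build.
    FetchContent_MakeAvailable(vllm-flash-attn)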

Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
@LucasWilkinson LucasWilkinson added the ready ONLY add when PR is ready to merge/full CI is needed label Sep 30, 2025
@jmkuebler jmkuebler (Contributor) commented

@LucasWilkinson would you mind bumping directly to 07602adc2baa7a3ea037321ef2ca71c83096e5d5 to include vllm-project/flash-attention#96 as well?

@LucasWilkinson LucasWilkinson merged commit 418d111 into vllm-project:main Oct 2, 2025
81 checks passed
@LucasWilkinson LucasWilkinson (Collaborator, Author) commented

> @LucasWilkinson would you mind bumping directly to 07602adc2baa7a3ea037321ef2ca71c83096e5d5 to include vllm-project/flash-attention#96 as well?

It takes a while to get it through CI (we need to populate sccache), so I'll try to create a fast follow-up that includes:

vllm-project/flash-attention#96
vllm-project/flash-attention#95

👍 apologies for the cumbersome process

pdasigi pushed a commit to pdasigi/vllm that referenced this pull request Oct 2, 2025
Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
yewentao256 pushed a commit that referenced this pull request Oct 3, 2025
Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
Signed-off-by: yewentao256 <zhyanwentao@126.com>
tomeras91 pushed a commit to tomeras91/vllm that referenced this pull request Oct 6, 2025
Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
Signed-off-by: Tomer Asida <57313761+tomeras91@users.noreply.github.com>
southfreebird pushed a commit to southfreebird/vllm that referenced this pull request Oct 7, 2025
Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 10, 2025
Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
Signed-off-by: xuebwang-amd <xuebwang@amd.com>
lywa1998 pushed a commit to lywa1998/vllm that referenced this pull request Oct 20, 2025
Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
alhridoy pushed a commit to alhridoy/vllm that referenced this pull request Oct 24, 2025
Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 24, 2025
Signed-off-by: Lucas Wilkinson <lwilkins@redhat.com>
Signed-off-by: xuebwang-amd <xuebwang@amd.com>
