This repository has been archived by the owner on Oct 11, 2024. It is now read-only.

rm model_executor/layers/attention directory since it's been moved #181

Merged: tlrmchlsmth merged 1 commit into main from tms/fixup_attention on Apr 11, 2024

Conversation

tlrmchlsmth (Member) commented on Apr 11, 2024

Looks like this directory was moved in vllm-project#3462, but the old directory has been hanging around for a while in our repo.

andy-neuma (Member) left a comment


cool

tlrmchlsmth merged commit 121857b into main on Apr 11, 2024
2 checks passed
tlrmchlsmth deleted the tms/fixup_attention branch on Apr 11, 2024 at 20:55