
[Misc] Enable vLLM to Dynamically Load LoRA from a Remote Server #10546


Merged
19 commits merged into vllm-project:main from the dynamic_lora_resolver branch on Apr 15, 2025

Conversation

@angkywilliam (Contributor) commented Nov 21, 2024

In the current implementation, vLLM only supports loading LoRA adapters from local storage.

At OpenPipe, we are extending the serving engine's capabilities by introducing a LoRAResolver. A LoRAResolver lets vLLM users implement custom logic for fetching LoRA files from remote servers.

For example, in OpenPipe's case, we dynamically load LoRA adapters for our customers from S3.
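To make the idea concrete, here is a minimal sketch of what such a resolver could look like. The class and method names (`LoRAResolver`, `resolve_lora`), the S3 object layout, and the cache directory are assumptions for illustration only, not the interface introduced by this PR.

```python
# Illustrative sketch only; names and layout are assumptions, not vLLM's API.
import os
from abc import ABC, abstractmethod
from typing import Optional

import boto3  # assumed dependency for the S3 example


class LoRAResolver(ABC):
    """Resolve a LoRA adapter name to a local directory vLLM can load."""

    @abstractmethod
    def resolve_lora(self, lora_name: str) -> Optional[str]:
        """Return a local path containing the adapter files, or None."""


class S3LoRAResolver(LoRAResolver):
    def __init__(self, bucket: str, cache_dir: str = "/tmp/lora_cache"):
        self.bucket = bucket
        self.cache_dir = cache_dir
        self.s3 = boto3.client("s3")

    def resolve_lora(self, lora_name: str) -> Optional[str]:
        local_dir = os.path.join(self.cache_dir, lora_name)
        os.makedirs(local_dir, exist_ok=True)
        # Assumed object layout: s3://<bucket>/<lora_name>/<file>
        for fname in ("adapter_config.json", "adapter_model.safetensors"):
            local_path = os.path.join(local_dir, fname)
            if not os.path.exists(local_path):
                try:
                    self.s3.download_file(
                        self.bucket, f"{lora_name}/{fname}", local_path)
                except Exception:
                    return None  # adapter not found in the bucket
        return local_dir
```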


👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, they only run the fastcheck CI, which runs a small, essential subset of CI tests to quickly catch errors. You can run other CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can do one of these:

  • Add ready label to the PR
  • Enable auto-merge.

🚀

@mergify mergify bot added the frontend label Nov 21, 2024
@angkywilliam angkywilliam changed the title Dynamic LoRA Resolver [Misc] Enable vLLM to Dynamically Load LoRA from a Remote Server Nov 21, 2024
@angkywilliam angkywilliam marked this pull request as ready for review November 21, 2024 20:49
@angkywilliam (Contributor, Author) commented Nov 21, 2024

@simon-mo @khluu Can I get help unblocking Buildkite? Thanks!

@khluu (Collaborator) commented Nov 21, 2024

> @simon-mo @khluu Can I get help unblocking Buildkite? Thanks!

Which test do you need to unblock? Or, if you want permission to unblock tests, can I have the email associated with your GitHub account?

@angkywilliam (Contributor, Author) commented Nov 22, 2024

> > @simon-mo @khluu Can I get help unblocking Buildkite? Thanks!
>
> Which test do you need to unblock? Or, if you want permission to unblock tests, can I have the email associated with your GitHub account?

Hey @khluu, this is my first time opening a PR to vLLM.

Do I need to pass all the tests in Buildkite for the code to be reviewed and merged?
If yes, I would need help unblocking all the tests in Buildkite.

(screenshot of the blocked Buildkite checks, 2024-11-21 at 5:00 PM)

I also sent you an email, with the subject "Unblock Dynamically Load LoRA from a Remote Server", containing my email address.

Thanks for the help!

@jeejeelee (Collaborator) commented Nov 22, 2024

IIUC, although this PR is related to LoRA loading, it seems you haven't touched the underlying LoRA logic. What you might need is to add unit tests similar to #6566.
BTW, maybe you can refer to: https://github.com/vllm-project/vllm/blob/main/vllm/lora/worker_manager.py#L94
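A test along those lines might exercise the resolving path with a dummy resolver rather than real remote storage. The following is only a hedged sketch of the test shape; `DummyLoRAResolver` and its `resolve_lora` method are made-up names, not part of vLLM's test suite.

```python
# Hypothetical test sketch; names are illustrative, not vLLM's actual fixtures.


class DummyLoRAResolver:
    """Resolver that only 'knows' a single adapter name."""

    def resolve_lora(self, lora_name: str):
        if lora_name == "known_adapter":
            return "/tmp/fake_loras/known_adapter"
        return None


def test_resolver_returns_path_for_known_adapter():
    resolver = DummyLoRAResolver()
    assert resolver.resolve_lora("known_adapter") == "/tmp/fake_loras/known_adapter"


def test_resolver_returns_none_for_unknown_adapter():
    resolver = DummyLoRAResolver()
    assert resolver.resolve_lora("missing_adapter") is None
```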

@khluu (Collaborator) commented Nov 22, 2024

> > > @simon-mo @khluu Can I get help unblocking Buildkite? Thanks!
> >
> > Which test do you need to unblock? Or, if you want permission to unblock tests, can I have the email associated with your GitHub account?
>
> Hey @khluu, this is my first time opening a PR to vLLM.
>
> Do I need to pass all the tests in Buildkite for the code to be reviewed and merged? If yes, I would need help unblocking all the tests in Buildkite.
>
> (screenshot of the blocked Buildkite checks)
>
> I also sent you an email, with the subject "Unblock Dynamically Load LoRA from a Remote Server", containing my email address.
>
> Thanks for the help!

Oh, if it's just for merging, you can just wait until your PR is reviewed and approved; then your PR reviewer can run CI for you. Fastcheck is a subset of CI that runs short tests and gives you the flexibility to run extra specific tests if needed (which you need to be in the Buildkite org to do).

@DarkLight1337 (Member)

@youkaichao do you think we can make this a plugin?


mergify bot commented Jan 21, 2025

This pull request has merge conflicts that must be resolved before it can be
merged. Please rebase the PR, @angkywilliam.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

@mergify mergify bot added the needs-rebase label Jan 21, 2025
@corbt commented Jan 24, 2025

I would love to see this merged in!

@youkaichao (Member)

For LoRA-specific code, @jeejeelee has more context than me to review.

From the integration perspective, I think it's okay to create a LoRA resolver registry and expose a function to register new LoRA resolvers. We don't need a new plugin category; a general plugin should be enough to call the function that registers new LoRA resolvers.

One technical detail: we should collect all registered resolvers and try them one by one to see which resolver succeeds. We don't need to let users select the resolver.
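As a rough sketch of that registry idea (function names such as `register_lora_resolver` and `resolve_lora_from_any` are made up here, not the interface that was ultimately merged):

```python
# Rough sketch of the "registry + try each resolver" idea; names are illustrative.
from typing import Callable, List, Optional

# A resolver maps an adapter name to a local path, or returns None on a miss.
LoRAResolverFn = Callable[[str], Optional[str]]

_LORA_RESOLVERS: List[LoRAResolverFn] = []


def register_lora_resolver(resolver: LoRAResolverFn) -> None:
    """Called by a general plugin at startup to add a resolver."""
    _LORA_RESOLVERS.append(resolver)


def resolve_lora_from_any(lora_name: str) -> Optional[str]:
    """Try every registered resolver in order; the first success wins."""
    for resolver in _LORA_RESOLVERS:
        path = resolver(lora_name)
        if path is not None:
            return path
    return None
```

Because the resolvers are tried in registration order, the user never has to select one explicitly, which matches the point above.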

@angkywilliam (Contributor, Author) commented Mar 27, 2025

Thanks for the feedback, @youkaichao.

From what I understand so far, there are three main areas I'd need to modify for LoRA plugin support:

  1. lora/resolver.py
  • Introduce an abstraction for LoRAResolver
  • Add a LoRAResolver registry to enable plugin registration
  2. entrypoints/openai/serving_engine.py
  • Update _check_model to iterate through the registered lora_resolvers after the lora_requests check (rough sketch after this list)
  3. Register the custom lora_resolver
  • This registration would happen outside of the vLLM codebase, correct?
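For step 2, here is a simplified, synchronous sketch of the intended control flow. The function signature and the resolver callables are invented for this illustration; the real serving code is async and builds LoRARequest objects rather than storing bare paths.

```python
# Simplified sketch of the step-2 flow; names and signature are invented here.
from typing import Callable, Dict, List, Optional

LoRAResolverFn = Callable[[str], Optional[str]]


def check_model(model_name: str,
                served_model: str,
                lora_requests: Dict[str, str],
                resolvers: List[LoRAResolverFn]) -> bool:
    # The base model itself is always valid.
    if model_name == served_model:
        return True
    # Already-loaded LoRA adapters are valid.
    if model_name in lora_requests:
        return True
    # New behaviour: ask each registered resolver about the unknown adapter.
    for resolver in resolvers:
        local_path = resolver(model_name)
        if local_path is not None:
            # Record the resolved adapter so later requests find it directly.
            lora_requests[model_name] = local_path
            return True
    return False
```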

Does this sound like a reasonable approach, @youkaichao @jeejeelee?
I want to make sure I have the right direction before implementing it.

Thanks in advance!

@jeejeelee (Collaborator)

@joerunde WDYT

@angkywilliam (Contributor, Author) commented Mar 29, 2025

I’ve made the concrete change here: #15733
Let me know if this makes sense!

Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
@angkywilliam angkywilliam force-pushed the dynamic_lora_resolver branch from 7260dd4 to 397ed64 Compare April 1, 2025 03:12
@mergify mergify bot removed the needs-rebase label Apr 1, 2025
@angkywilliam (Contributor, Author)

Updated the code with:

  • Merged the latest code from main
  • Refactored the LoRA resolver as a plugin per @youkaichao's input

Angky William added 2 commits April 9, 2025 16:23
Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
Angky William added 3 commits April 11, 2025 18:02
Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
@mergify mergify bot added the documentation Improvements or additions to documentation label Apr 14, 2025
Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
@angkywilliam (Contributor, Author) commented Apr 14, 2025

@joerunde I’ve added a test for the LoRA loading flow via LoRAResolver, along with the corresponding documentation.
I believe the code is ready to be merged; let me know if you have any additional feedback.

@joerunde (Collaborator) left a comment

LGTM!

Thanks for the plugin support! @jeejeelee @youkaichao, what do y'all think about getting this merged?

Angky William added 2 commits April 14, 2025 13:49
…test

Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
@simon-mo (Collaborator)

Can you add an example of such a plugin to the documentation, so we have something people can leverage to learn how to use it?

Angky William added 2 commits April 15, 2025 10:53
Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
@angkywilliam (Contributor, Author)

@simon-mo I’ve added an example of the LoRAResolver implementation to the documentation.
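For readers skimming this thread, the general shape of such a plugin is a small installable package whose entry point registers a resolver when vLLM starts. The snippet below is only a hedged sketch: the entry-point group name, module names, and the registration call should be taken from the merged documentation rather than from here.

```python
# my_lora_resolver_plugin/__init__.py -- hedged sketch of a resolver plugin.
#
# Assumed packaging (pyproject.toml), using vLLM's general plugin entry-point
# group as described in its plugin docs:
#
#   [project.entry-points."vllm.general_plugins"]
#   register_s3_resolver = "my_lora_resolver_plugin:register"


def register() -> None:
    """Called by vLLM's plugin loader at startup to register the resolver."""
    # Deferred imports keep this module importable on its own; both names
    # below are placeholders for whatever the merged code actually exposes.
    from my_lora_resolver_plugin.s3_resolver import S3LoRAResolver  # hypothetical
    from my_lora_resolver_plugin.registry import register_lora_resolver  # hypothetical

    register_lora_resolver(S3LoRAResolver(bucket="my-lora-bucket"))
```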

@simon-mo simon-mo enabled auto-merge (squash) April 15, 2025 20:40
@jberkhahn (Contributor)

Also, if you want to see what a resolver implementation would look like, I have my PR converted to a plugin here: https://github.com/jberkhahn/filesystem_resolver

@simon-mo simon-mo merged commit fdcb850 into vllm-project:main Apr 15, 2025
49 checks passed
lionelvillard pushed a commit to lionelvillard/vllm that referenced this pull request Apr 17, 2025
…m-project#10546)

Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
Co-authored-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
yangw-dev pushed a commit to yangw-dev/vllm that referenced this pull request Apr 21, 2025
…m-project#10546)

Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
Co-authored-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
Signed-off-by: Yang Wang <elainewy@meta.com>
jikunshang pushed a commit to jikunshang/vllm that referenced this pull request Apr 29, 2025
…m-project#10546)

Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
Co-authored-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Apr 29, 2025
…m-project#10546)

Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
Co-authored-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
RichardoMrMu pushed a commit to RichardoMrMu/vllm that referenced this pull request May 12, 2025
…m-project#10546)

Signed-off-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
Co-authored-by: Angky William <angkywilliam@Angkys-MacBook-Pro.local>
Signed-off-by: Mu Huai <tianbowen.tbw@antgroup.com>
@PedroMiolaSilva

@angkywilliam Hey! I've been trying to use the LoRA resolver on a local file system with no luck. I've detailed my attempt here: #18630. Any help is welcome :)

After it works locally, I want to try loading from S3.

Labels
documentation (Improvements or additions to documentation), frontend, ready (ONLY add when PR is ready to merge/full CI is needed)
10 participants