[Model] Always use Transformers backend for PaliGemma and Gemma3-MM #26715
Conversation
Documentation preview: https://vllm--26715.org.readthedocs.build/en/26715/
Code Review
This pull request correctly identifies that the custom vLLM implementations for PaliGemma and Gemma3-MM do not properly handle their special attention masks. The solution to remove these custom implementations and fall back to the Hugging Face Transformers backend is a sound approach that prioritizes correctness. The changes are implemented thoroughly, with corresponding updates to model registries, documentation, and test suites. Notably, the removal of now-obsolete skipped tests and the addition of new tests for the Transformers backend demonstrate good testing practices. I find no high or critical issues in this pull request; it is a solid improvement.
@hmellor @zucchini-nlp Seems that …
LGTM
Yeah, the option with … Though it'll be in the v5 release with several breaking changes.
Purpose
Since vLLM doesn't support the special attention mask used by PaliGemma and Gemma3-MM (not to be confused with Gemma3n), this PR removes our custom implementations so that the Transformers backend is used for these models instead.
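For illustration, a minimal sketch of what loading one of these models could look like after this change; the checkpoint name and the commented-out `model_impl` override are assumptions for the example, not taken from this PR:

```python
from vllm import LLM, SamplingParams

# After this change, PaliGemma / Gemma3-MM checkpoints are served via the
# Transformers backend rather than a custom vLLM implementation.
llm = LLM(
    model="google/gemma-3-4b-it",  # example checkpoint (assumption)
    # Optional explicit override; the registry fallback should select the
    # Transformers backend automatically without it.
    # model_impl="transformers",
)

outputs = llm.generate(
    ["Describe what a vision-language model does in one sentence."],
    SamplingParams(max_tokens=32),
)
print(outputs[0].outputs[0].text)
```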
cc @hmellor
@NickLucche it would be great if you could test whether Gemma3 works with the Transformers backend on TPU!
Test Plan
Transformers backend tests should pass.
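Running these locally might look like the sketch below; the test file path and `-k` expression are assumptions about the test layout rather than commands taken from this PR:

```python
# Sketch: invoke the multimodal generation tests covering the affected models.
import sys

import pytest

sys.exit(
    pytest.main([
        "tests/models/multimodal/generation/test_common.py",  # assumed location
        "-k", "paligemma or gemma3",
        "-q",
    ])
)
```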
Test Result
Essential Elements of an Effective PR Description Checklist
Update `supported_models.md` and `examples` for a new model.