
Upgrade transformers to v4.50.3 #13905


Merged: 18 commits merged into vllm-project:main from update-transformers on Mar 31, 2025
Conversation

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small, essential subset of tests to catch errors quickly. You can run additional CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

@hmellor hmellor added the ready ONLY add when PR is ready to merge/full CI is needed label Feb 26, 2025
@mergify mergify bot added documentation Improvements or additions to documentation ci/build labels Feb 26, 2025
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>

mergify bot commented Mar 4, 2025

This pull request has merge conflicts that must be resolved before it can be
merged. Please rebase the PR, @hmellor.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@mergify mergify bot removed the needs-rebase label Mar 10, 2025
@DarkLight1337
Member

Has this been resolved? We want to upgrade to v4.50 for Gemma3 release.

@hmellor
Member Author

hmellor commented Mar 12, 2025

Not yet, we're expecting a release sometime next week

@aarnphm
Collaborator

aarnphm commented Mar 21, 2025

Can we also bump lm-format-enforcer version for v0?

@hmellor
Member Author

hmellor commented Mar 21, 2025

V1 and V0 do not have different requirements. lm-format-enforcer was last bumped 3 weeks ago; which version do you need?

hmellor added 2 commits March 21, 2025 14:43
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@hmellor hmellor changed the title from "Upgrade transformers to v4.49.0" to "Upgrade transformers to v4.50.0" Mar 21, 2025
@aarnphm
Collaborator

aarnphm commented Mar 21, 2025

Oh, I thought it wasn't bumped. Please disregard that message, my bad 😃

@DarkLight1337
Member

As mentioned before, let's just skip those tests where the HF repo can't keep up with the latest transformers version

@hmellor
Member Author

hmellor commented Mar 26, 2025

What is the preferred method of skipping?

  • Delete model from list of models?
  • Comment model and add reason?
  • Add pytest.mark.skip?
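
For illustration, a minimal sketch of the pytest.mark.skip option, assuming a parametrized model list; the model IDs and reason string below are placeholders, not part of this PR:

```python
import pytest

# Hypothetical parametrized model list; one entry is skipped with a documented reason.
MODELS = [
    "org/model-that-still-works",
    pytest.param(
        "org/model-with-outdated-hf-repo",
        marks=pytest.mark.skip(
            reason="HF repo not yet updated for transformers v4.50"),
    ),
]


@pytest.mark.parametrize("model", MODELS)
def test_model(model: str) -> None:
    # The real test body would load and run the model here.
    assert model
```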

@DarkLight1337
Member

DarkLight1337 commented Mar 26, 2025

Let's add a new field max_transformers_version to _HfExamplesInfo, and update check_transformers_version accordingly

(The tests for Isotr0py/deepseek-vl2-tiny and TIGER-Lab/Mantis-8B-siglip-llama3 are already being skipped in test_models.py, we can also update that code to use max_transformers_version)
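
For reference, a rough sketch of what that could look like. Only max_transformers_version and check_transformers_version come from the suggestion above; the surrounding fields, defaults, and messages are assumptions for illustration, not the actual _HfExamplesInfo implementation:

```python
from dataclasses import dataclass
from typing import Optional

import pytest
from packaging.version import Version
from transformers import __version__ as TRANSFORMERS_VERSION


@dataclass(frozen=True)
class _HfExamplesInfo:
    default: str  # HF repo ID used by the tests
    min_transformers_version: Optional[str] = None
    # New field: the newest transformers version the HF repo still works with.
    max_transformers_version: Optional[str] = None

    def check_transformers_version(self, *, on_fail: str = "skip") -> None:
        """Skip (or fail) when the installed transformers version is out of range."""
        installed = Version(TRANSFORMERS_VERSION)
        msg = None
        if (self.min_transformers_version is not None
                and installed < Version(self.min_transformers_version)):
            msg = f"{self.default} needs transformers>={self.min_transformers_version}"
        elif (self.max_transformers_version is not None
              and installed > Version(self.max_transformers_version)):
            msg = f"{self.default} needs transformers<={self.max_transformers_version}"
        if msg is None:
            return
        if on_fail == "skip":
            pytest.skip(msg)
        raise RuntimeError(msg)
```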

@hmellor
Member Author

hmellor commented Mar 26, 2025

> Let's add a new field max_transformers_version to _HfExamplesInfo, and update check_transformers_version accordingly

Ok I can make this change and use it for models which are not currently compatible with vLLM because they're outdated.

> (The tests for Isotr0py/deepseek-vl2-tiny and TIGER-Lab/Mantis-8B-siglip-llama3 are already being skipped in test_models.py, we can also update that code to use max_transformers_version)

For these ones it's only the HfRunner that has the maximum Transformers version requirement. These models do actually work with vLLM.

hmellor added 2 commits March 26, 2025 19:39
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@DarkLight1337
Member

>> Let's add a new field max_transformers_version to _HfExamplesInfo, and update check_transformers_version accordingly

> Ok I can make this change and use it for models which are not currently compatible with vLLM because they're outdated.

>> (The tests for Isotr0py/deepseek-vl2-tiny and TIGER-Lab/Mantis-8B-siglip-llama3 are already being skipped in test_models.py, we can also update that code to use max_transformers_version)

> For these ones it's only the HfRunner that has the maximum Transformers version requirement. These models do actually work with vLLM.

Yeah they do work with vLLM. I think the other models would also work with vLLM if not for the outdated imports...

@hmellor
Member Author

hmellor commented Mar 27, 2025

Yeah the outdated imports are the only blocker

hmellor added 3 commits March 27, 2025 11:06
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@hmellor
Member Author

hmellor commented Mar 30, 2025

The two further issues I found have been added to the description. Fix PRs have already been merged in transformers; we'd just need another patch release.

@hmellor hmellor changed the title from "Upgrade transformers to v4.50.2" to "Upgrade transformers to v4.50.3" Mar 31, 2025

mergify bot commented Mar 31, 2025

This pull request has merge conflicts that must be resolved before it can be
merged. Please rebase the PR, @hmellor.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

@mergify mergify bot added the needs-rebase label Mar 31, 2025
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@mergify mergify bot removed the needs-rebase label Mar 31, 2025
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@vllm-bot vllm-bot merged commit e5ef4fa into vllm-project:main Mar 31, 2025
61 of 64 checks passed
@hmellor hmellor deleted the update-transformers branch March 31, 2025 16:07
@ywang96 ywang96 mentioned this pull request Apr 1, 2025
Alex4210987 pushed a commit to LeiWang1999/vllm-bitblas that referenced this pull request Apr 5, 2025
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Signed-off-by: xinyuxiao <xinyuxiao2024@gmail.com>
lulmer pushed a commit to lulmer/vllm that referenced this pull request Apr 7, 2025
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Signed-off-by: Louis Ulmer <ulmerlouis@gmail.com>
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Apr 29, 2025
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
shreyankg pushed a commit to shreyankg/vllm that referenced this pull request May 3, 2025
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
RichardoMrMu pushed a commit to RichardoMrMu/vllm that referenced this pull request May 12, 2025
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Signed-off-by: Mu Huai <tianbowen.tbw@antgroup.com>
Labels
ci/build
documentation (Improvements or additions to documentation)
ready (ONLY add when PR is ready to merge/full CI is needed)