
Bump transformers from 4.50.3 to 4.52.4 #2160


Merged: 1 commit into main on Jun 1, 2025

Conversation

dependabot[bot] (Contributor) commented on behalf of github, Jun 1, 2025

Bumps transformers from 4.50.3 to 4.52.4.

Release notes

Sourced from transformers' releases.

Patch release: v4.52.4

The following commits are included in this patch release:

  • [qwen-vl] Look for vocab size in text config (#38372)
  • Fix convert to original state dict for VLMs (#38385)
  • [video utils] group and reorder by number of frames (#38374)
  • [paligemma] fix processor with suffix (#38365)
  • Protect get_default_device for torch<2.3 (#38376); a version-guard sketch follows this list
  • [OPT] Fix attention scaling (#38290)
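
The get_default_device guard above is a version-gating pattern: torch.get_default_device only exists in torch >= 2.3, so older versions need a fallback rather than an AttributeError. A minimal sketch of the idea, not the actual transformers patch:

```python
# Illustrative version-gating sketch (the real transformers fix may differ):
# call torch.get_default_device only on torch >= 2.3, where it exists.
import torch
from packaging import version

if version.parse(torch.__version__) >= version.parse("2.3"):
    device = torch.get_default_device()
else:
    # Older torch: fall back to CPU as a safe default.
    device = torch.device("cpu")

print(device)
```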

Patch release v4.52.3

We had to protect the imports again after a series of unfortunate events. Here are the two PRs for the patch:

Patch release v4.52.2

We had to revert #37877 because of a missing flag that was overriding the device map. We re-introduced the changes because they allow native 3D parallel training in Transformers. Sorry everyone for the troubles! 🤗

Patch release v4.51.3

A mix of bugs was fixed in this patch; very exceptionally, we diverged from semantic versioning to merge GLM-4 in this patch release.

  • Handle torch ver in flexattn (#37400)
  • handle torch version edge cases (#37399)
  • Add glm4 (#37388)

Patch Release 4.51.2

This is another round of bug fixes, but they are much more minor, and outputs were not really affected!

Patch release v4.51.1

Since the release of Llama 4, we have fixed a few issues that we are now releasing in patch v4.51.1:

  • Fixing flex attention for torch=2.6.0 (#37285)
  • more fixes for post-training llama4 (#37329)
  • Remove HQQ from caching allocator warmup (#37347)
  • fix derived berts _init_weights (#37341)
  • Fix init empty weights without accelerate (#37337)
  • Fix deepspeed with quantization (#37324)

... (truncated)

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
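
The ignore commands above also have a configuration-file counterpart. A minimal dependabot.yml sketch; the pip ecosystem, root directory, and weekly schedule are assumptions for illustration, not this repository's actual settings:

```yaml
# Illustrative dependabot.yml, not this repository's actual configuration.
version: 2
updates:
  - package-ecosystem: "pip"   # watch pip/requirements dependencies
    directory: "/"             # assumed manifest location
    schedule:
      interval: "weekly"       # assumed update cadence
    ignore:
      - dependency-name: "transformers"
        # Config equivalent of "@dependabot ignore this major version":
        # skip major bumps while still opening minor/patch PRs.
        update-types: ["version-update:semver-major"]
```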

Bumps [transformers](https://github.com/huggingface/transformers) from 4.50.3 to 4.52.4.
- [Release notes](https://github.com/huggingface/transformers/releases)
- [Commits](https://github.com/huggingface/transformers/compare/v4.50.3...v4.52.4)

---
updated-dependencies:
- dependency-name: transformers
  dependency-version: 4.52.4
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
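
After merging a pin bump like this, a quick sanity check (an illustrative snippet, not part of this PR) confirms the environment actually picked up the new version:

```python
# Illustrative check that the installed package matches the bumped pin.
import transformers

assert transformers.__version__ == "4.52.4", transformers.__version__
print("transformers", transformers.__version__)
```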
dependabot[bot] (Contributor, Author) commented on behalf of github, Jun 1, 2025

The reviewers field in the dependabot.yml file will be removed soon. Please use the code owners file to specify reviewers for Dependabot PRs. For more information, see this blog post.
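
A minimal CODEOWNERS sketch of what that migration could look like; the path pattern is an assumption for illustration (this repository's layout may differ), and the reviewer is taken from the review request below:

```
# Illustrative CODEOWNERS entry replacing the deprecated `reviewers`
# field in dependabot.yml: the named user is requested for review on
# any change to the pinned requirements files (path is an assumption).
requirements/*.txt @Borda
```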

@dependabot dependabot bot requested a review from Borda June 1, 2025 02:04
@dependabot dependabot bot added dependencies python Pull requests that update Python code labels Jun 1, 2025
@dependabot dependabot bot requested review from mruberry, lantiga and t-vi as code owners June 1, 2025 02:04
@t-vi t-vi enabled auto-merge (squash) June 1, 2025 09:01
t-vi (Collaborator) left a comment


If it works...

@t-vi t-vi merged commit bce01d5 into main Jun 1, 2025
55 checks passed
@t-vi t-vi deleted the dependabot-pip-transformers-4.52.4 branch June 1, 2025 09:01
Labels: dependencies, python