Fix release full #45029

Merged: ArthurZucker merged 33 commits into main from fix-release-full on Mar 27, 2026
Conversation

@ArthurZucker
Collaborator

What does this PR do?

Release workflow is failing

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

- `is_torchvision_available`, `is_timm_available`, `is_torchaudio_available`,
  `is_torchao_available`, `is_accelerate_available` now return False when
  torch is not installed, since all these packages require torch
- Add `@requires(backends=("torch",))` to `PI0Processor` (was missing,
  causing the lazy module to crash on import without torch)
- Fix wrong availability guards: `is_vision_available` → `is_torchvision_available`
  in pixtral processor, `is_torch_available` in smolvlm processor
- Wrap bare `import torch` / torchvision imports in `processing_sam3_video.py`
- Quote `torch.Tensor` in return type annotation of `tokenization_mistral_common.py`
- Wrap 66 `image_processing_pil_*.py` imports from torch-dependent counterparts
  in try/except with ImagesKwargs fallbacks; quote `torch.Tensor` annotations
- Restore explicit `from transformers import *` check in CircleCI
  `check_repository_consistency` job
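
The availability-check fix in the first bullet can be sketched as follows. This is a simplified illustration of the pattern, not the actual implementation in `transformers.utils.import_utils`:

```python
import importlib.util

# Hedged sketch: torch-dependent availability checks short-circuit when
# torch itself is missing, since these packages cannot work without it.
def is_torch_available() -> bool:
    return importlib.util.find_spec("torch") is not None

def is_torchao_available() -> bool:
    if not is_torch_available():
        # torchao requires torch, so it cannot be usable without it
        return False
    return importlib.util.find_spec("torchao") is not None
```

The same guard applies to the torchvision, timm, torchaudio, and accelerate checks.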

Co-Authored-By: Claude Sonnet 4.6 (1M context) <noreply@anthropic.com>
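
The `torch.Tensor`-quoting fix works because string annotations are stored verbatim and never evaluated at import time. A minimal sketch, with a hypothetical `to_tensor` helper standing in for the real method:

```python
# A string annotation is never evaluated at import time, so this module
# imports cleanly even when torch is not installed.
def to_tensor(data) -> "torch.Tensor":  # hypothetical helper for illustration
    import torch  # deferred: only needed when the function is actually called
    return torch.tensor(data)

print(to_tensor.__annotations__["return"])  # the plain string 'torch.Tensor'
```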
@ArthurZucker
Collaborator Author

@vasqu could you review this? It needs to be fixed before we release: `from transformers import *` was broken without torch installed.

Comment on lines +1201 to +1202
if not is_torch_available():
return False
Contributor

@vasqu Mar 26, 2026


Is this an infinite loop? :D Edit: no, sorry, it's ao (torchao)

ArthurZucker and others added 21 commits March 26, 2026 23:38
The PIL image processor changes are too fragile (break on make fix-repo).
Keep only the core fixes:
- is_torchvision/timm/torchaudio/torchao/accelerate_available() check torch
- CircleCI explicit import check
- tokenization_mistral_common.py torch.Tensor annotation
- processing_sam3_video.py conditional torch imports
- processing_pixtral.py/processing_smolvlm.py availability guard fixes
- PI0Processor @requires decorator

Co-Authored-By: Claude Sonnet 4.6 (1M context) <noreply@anthropic.com>
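
The PI0Processor fix relies on a backend-gating decorator. A hedged sketch in the spirit of the `@requires(backends=("torch",))` usage above; the real decorator lives in `transformers.utils` and differs in detail (stdlib `json` stands in for `torch` here so the sketch is self-contained):

```python
import importlib.util

# Simplified backend-gating decorator: record which backends are missing
# at decoration time and raise on instantiation, not at import time.
def requires(backends=()):
    def decorator(cls):
        missing = [b for b in backends if importlib.util.find_spec(b) is None]
        original_init = cls.__init__

        def __init__(self, *args, **kwargs):
            if missing:
                raise ImportError(f"{cls.__name__} requires: {', '.join(missing)}")
            original_init(self, *args, **kwargs)

        cls.__init__ = __init__
        return cls
    return decorator

@requires(backends=("json",))  # stdlib stand-in for "torch" in this sketch
class DemoProcessor:
    pass

DemoProcessor()  # succeeds: the required backend is importable
```

Deferring the error to instantiation is what lets the lazy module import without torch.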
@github-actions
Contributor

[For maintainers] Suggested jobs to run (before merge)

run-slow: aria, beit, bridgetower, conditional_detr, convnext, deepseek_vl, deepseek_vl_hybrid, deformable_detr, detr, donut, dpt, efficientloftr, efficientnet, eomt, ernie4_5_vl_moe, flava

"torch",
"torchvision",
),
lambda name, content: name.startswith("image_processing_") and "pil" in name: ("vision",),
Collaborator Author


name.endswith("_fast"): now default don't

@@ -345,6 +345,7 @@
name for name in dir(dummy_torchvision_objects) if not name.startswith("_")
]
else:
Collaborator Author


torchvision depends on torch

Collaborator Author


second most important

Comment on lines +2530 to +2537
lambda name, content: "tokenization_" in name and name.endswith("_fast"): ("tokenizers",),
lambda name, content: "image_processing_" in name and "TorchvisionBackend" in content: (
"vision",
"torch",
"torchvision",
),
lambda name, content: name.startswith("image_processing_") and "TorchvisionBackend" in content: (
"vision",
"torch",
"torchvision",
),
lambda name, content: name.startswith("image_processing_"): ("vision",),
lambda name, content: name.startswith("video_processing_"): ("vision", "torch", "torchvision"),
lambda name, content: "image_processing_" in name: ("vision",),
lambda name, content: "video_processing_" in name: ("vision", "torch", "torchvision"),
Collaborator Author


most important change: @LysandreJik this is why it was failing, `name` is the full path so `startswith` never matched
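
A minimal illustration of that bug, using a hypothetical file path:

```python
# `name` is the full file path, so a startswith() rule never fires;
# substring matching does. The path below is hypothetical.
name = "src/transformers/models/detr/image_processing_detr.py"

assert not name.startswith("image_processing_")  # old rule: never matches
assert "image_processing_" in name               # fixed rule: matches
print("substring rule matches:", "image_processing_" in name)
```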

ArthurZucker marked this pull request as ready for review March 26, 2026 23:47
Comment on lines 177 to 198
- run:
name: "Test import with all backends (torch + PIL + torchvision)"
command: python -c "from transformers import *" || (echo '🚨 import failed with all backends. Fix unprotected imports!! 🚨'; exit 1)
- run:
name: "Test import with torch only (no PIL, no torchvision)"
command: |
uv pip uninstall Pillow torchvision -q
python -c "from transformers import *" || (echo '🚨 import failed with torch only (no PIL). Fix unprotected imports!! 🚨'; exit 1)
uv pip install -e ".[quality]" -q
- run:
name: "Test import with PIL only (no torch, no torchvision)"
command: |
uv pip uninstall torch torchvision torchaudio -q
python -c "from transformers import *" || (echo '🚨 import failed with PIL only (no torch). Fix unprotected imports!! 🚨'; exit 1)
uv pip install -e ".[quality]" -q
- run:
name: "Test import with torch + PIL, no torchvision"
command: |
uv pip uninstall torchvision -q
python -c "from transformers import *" || (echo '🚨 import failed with torch+PIL but no torchvision. Fix unprotected imports!! 🚨'; exit 1)
uv pip install -e ".[quality]" -q
- run: make check-repository-consistency
Collaborator Author


cc @tarekziade and @ydshieh, we were missing a bunch of these checks to enforce that our `import *` holds with respect to our hard deps only
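
The CI steps above can be reproduced locally with a subprocess smoke test. A hedged sketch, using a stdlib module as a stand-in for transformers so it runs anywhere:

```python
import subprocess
import sys

# Run the same star-import smoke test the CI steps perform, in a child
# interpreter so an ImportError cannot take down the caller. "json" is a
# stdlib stand-in here; CI runs this against "transformers" under each
# backend combination (all backends, torch only, PIL only, no torchvision).
def star_import_ok(module: str) -> bool:
    proc = subprocess.run(
        [sys.executable, "-c", f"from {module} import *"],
        capture_output=True,
    )
    return proc.returncode == 0

print(star_import_ok("json"))  # True: stdlib json star-imports cleanly
```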

Collaborator


good catch, yeah the original check (moved to check-repository-consistency) only checked whether transformers is importable; we'll extend it

ArthurZucker merged commit 97b7727 into main Mar 27, 2026
28 of 30 checks passed
ArthurZucker deleted the fix-release-full branch March 27, 2026 00:11
Lidang-Jiang added a commit to Lidang-Jiang/transformers that referenced this pull request Mar 27, 2026
…age processors

PR huggingface#45029 added @requires(backends=("vision", "torch", "torchvision")) to 67
PIL backend image_processing_pil_*.py files. This causes PIL backend classes
to become dummy objects when torchvision is not installed, making
AutoImageProcessor unable to find any working processor.

Fix: set @requires to ("vision",) for files that only need PIL, and
("vision", "torch") for files that also use torch directly. Also fix
5 modular source files so make fix-repo preserves the correct backends.

Fixes huggingface#45042

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
zucchini-nlp pushed a commit to zucchini-nlp/transformers that referenced this pull request Mar 27, 2026
* first part of the fix

* fix torch imports

* revert

* fix: make from transformers import * work without torch

- `is_torchvision_available`, `is_timm_available`, `is_torchaudio_available`,
  `is_torchao_available`, `is_accelerate_available` now return False when
  torch is not installed, since all these packages require torch
- Add `@requires(backends=("torch",))` to `PI0Processor` (was missing,
  causing the lazy module to crash on import without torch)
- Fix wrong availability guards: `is_vision_available` → `is_torchvision_available`
  in pixtral processor, `is_torch_available` in smolvlm processor
- Wrap bare `import torch` / torchvision imports in `processing_sam3_video.py`
- Quote `torch.Tensor` in return type annotation of `tokenization_mistral_common.py`
- Wrap 66 `image_processing_pil_*.py` imports from torch-dependent counterparts
  in try/except with ImagesKwargs fallbacks; quote `torch.Tensor` annotations
- Restore explicit `from transformers import *` check in CircleCI
  `check_repository_consistency` job

Co-Authored-By: Claude Sonnet 4.6 (1M context) <noreply@anthropic.com>

* up

* style of this

* revert: remove src/models changes, keep only core import fixes

The PIL image processor changes are too fragile (break on make fix-repo).
Keep only the core fixes:
- is_torchvision/timm/torchaudio/torchao/accelerate_available() check torch
- CircleCI explicit import check
- tokenization_mistral_common.py torch.Tensor annotation
- processing_sam3_video.py conditional torch imports
- processing_pixtral.py/processing_smolvlm.py availability guard fixes
- PI0Processor @requires decorator

Co-Authored-By: Claude Sonnet 4.6 (1M context) <noreply@anthropic.com>

* nit

* the mega quidproquo

* use rquires(backend

* more pil fixes

* fixes

* temp update

* up?

* is this it?

* style?

* revert a bunch of ai shit

* pi0 requires this

* revert some stuffs

* upd

* the fix

* yups

* ah

* up

* up

* fix

* yes?

* update

* up

* nits

* up

* up

* order

---------

Co-authored-by: Claude Sonnet 4.6 (1M context) <noreply@anthropic.com>
ArthurZucker added a commit that referenced this pull request Mar 30, 2026
…age processors (#45045)

* [Bugfix] Remove incorrect torchvision requirement from PIL backend image processors

PR #45029 added @requires(backends=("vision", "torch", "torchvision")) to 67
PIL backend image_processing_pil_*.py files. This causes PIL backend classes
to become dummy objects when torchvision is not installed, making
AutoImageProcessor unable to find any working processor.

Fix: set @requires to ("vision",) for files that only need PIL, and
("vision", "torch") for files that also use torch directly. Also fix
5 modular source files so make fix-repo preserves the correct backends.

Fixes #45042

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* [Bugfix] Remove redundant @requires(backends=("vision",)) from PIL backends

Per reviewer feedback: the vision-only @requires decorator is redundant
for PIL backend classes since PilBackend base class already handles this.

- Remove @requires(backends=("vision",)) from 43 PIL backend files
- Remove unused `requires` import from 38 files (Category A)
- Keep @requires(backends=("vision", "torch")) on method-level decorators (Category B: 5 files)

* update

* remove torch when its not necessary

* remove if typechecking

* fix  import shinanigans

* marvellous that's how we protect torch :)

* beit is torchvisionbackend

* more import cleanup

* fiixup

* fix-repo

* update

* style

* fixes

* up

* more

* fix repo

* up

* update

* fix imports

* style

* fix check copies

* arf

* converter up

* fix?

* fix copies

* fix for func

* style

* ignore

* type

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Arthur <arthur.zucker@gmail.com>
NielsRogge pushed a commit to NielsRogge/transformers that referenced this pull request Mar 30, 2026
NielsRogge pushed a commit to NielsRogge/transformers that referenced this pull request Mar 30, 2026
…age processors (huggingface#45045)

SangbumChoi pushed a commit to SangbumChoi/transformers that referenced this pull request Apr 4, 2026
…age processors (huggingface#45045)
