Summary
Transformers v5.0.0 was released 3 hours ago and causes GLiNER to return uniform low confidence scores (~0.05) for all entity types, effectively breaking entity extraction.
Environment
- GLiNER: 0.2.23
- Transformers: 5.0.0 (broken) / 4.57.3 (working)
- Platform: Jetson AGX Orin (ARM64) with CUDA 12.6
- Base image: dustynv/pytorch:2.6-r36.4.0-cu128
- Model: EmergentMethods/gliner_large_news-v2.1
Symptoms
- Model loads successfully without errors
- predict_entities() returns entities, but with near-identical low scores
- All tokens receive ~0.05 confidence regardless of actual entity type
- The model always selects the last label in the label list
Debug output example:
Text: "Donald Trump met with Nicolas Maduro in Cuba"
Labels: ["person", "organization", "location"]
Result with transformers 5.0.0:
- "Cuba" -> location (0.051)
- "Trump" -> location (0.049) # Wrong! Should be person
- "Maduro" -> location (0.048) # Wrong! Should be person
Result with transformers 4.57.3:
- "Donald Trump" -> person (0.86) ✓
- "Nicolas Maduro" -> person (0.86) ✓
- "Cuba" -> location (0.90) ✓
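For anyone triaging this, the failure mode is easy to detect programmatically: all scores collapse into a tight band near zero. Below is a small hypothetical helper (not part of GLiNER) that flags that pattern given the score list from predict_entities():

```python
def scores_look_degenerate(scores, max_score=0.1, max_spread=0.01):
    """Heuristic: flag the failure mode where every predicted span
    gets a near-identical, very low confidence score."""
    if not scores:
        return False
    return max(scores) < max_score and (max(scores) - min(scores)) < max_spread

# Scores observed under transformers 5.0.0 vs 4.57.3:
print(scores_look_degenerate([0.051, 0.049, 0.048]))  # broken run -> True
print(scores_look_degenerate([0.86, 0.86, 0.90]))     # healthy run -> False
```

The thresholds are arbitrary; the point is that a healthy run has both higher and more spread-out scores.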
Root Cause
Transformers v5.0.0 includes a breaking change to default dtype handling:
The library now loads models in their original saved dtype rather than forcing float32.
This likely affects GLiNER's span scoring mechanism, causing the uniform low scores.
Workaround
Pin transformers to v4.x in your dependencies:
transformers>=4.57.3,<5.0.0
Dockerfile example:
RUN pip3 install --no-cache-dir \
    "transformers>=4.57.3,<5.0.0" \
    "gliner>=0.2.23"
Suggestion
Consider:
- Adding an upper bound on transformers in setup.py/pyproject.toml until compatibility is confirmed
- Testing against transformers v5.0.0 to identify the specific breaking change
- Adding a runtime warning if transformers >= 5.0.0 is detected
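The runtime warning could be as simple as a major-version check at import time. A minimal sketch (the string parsing deliberately avoids extra dependencies; the function name is hypothetical):

```python
import warnings

def warn_if_incompatible(version: str) -> bool:
    """Warn when the installed transformers major version is >= 5.
    Returns True if a warning was issued."""
    major = int(version.split(".")[0])
    if major >= 5:
        warnings.warn(
            "transformers >= 5.0.0 is known to break GLiNER span scoring; "
            "pin 'transformers>=4.57.3,<5.0.0' until compatibility is confirmed.",
            RuntimeWarning,
        )
        return True
    return False

# Inside GLiNER this would run against the installed version, e.g.:
#   import transformers; warn_if_incompatible(transformers.__version__)
print(warn_if_incompatible("5.0.0"))   # True
print(warn_if_incompatible("4.57.3"))  # False
```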
Related
- Transformers v5.0.0 Release Notes
- Possibly related to #306 which addressed earlier dtype issues
Happy to provide more debug info if helpful. Thanks for the great library! 🙏