Fix GPT2 attention scaling ignored in SDPA/FlashAttention #44397
CircleCI Checks / setup_and_quality — succeeded (Mar 4, 2026, in 5m 37s)
Workflow: setup_and_quality
- fetch_tests - Success
- check_circleci_user - Success
- check_repository_consistency - Success
- check_code_quality - Success
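The PR body is not included here, but the title suggests that GPT-2's configurable attention scale was being dropped on the SDPA/FlashAttention path, which applies the backend default of `1/sqrt(head_dim)` unless an explicit scale is forwarded. The sketch below is a NumPy illustration of that failure mode, not the actual fix: the `sdpa` helper, the `layer_idx` value, and the `scale_attn_by_inverse_layer_idx`-style per-layer scale are all assumptions for demonstration.

```python
import numpy as np

def sdpa(q, k, v, scale=None):
    # Mirrors the contract of torch.nn.functional.scaled_dot_product_attention:
    # when no explicit scale is passed, the default 1/sqrt(head_dim) is applied.
    if scale is None:
        scale = 1.0 / np.sqrt(q.shape[-1])
    scores = (q @ k.swapaxes(-1, -2)) * scale
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((2, 4, 8)) for _ in range(3))

# Hypothetical GPT-2-style per-layer scaling (cf. scale_attn_by_inverse_layer_idx):
# divide by sqrt(head_dim) AND by the 1-based layer index.
layer_idx = 3
gpt2_scale = 1.0 / (np.sqrt(q.shape[-1]) * (layer_idx + 1))

default_out = sdpa(q, k, v)                  # custom scale silently ignored
fixed_out = sdpa(q, k, v, scale=gpt2_scale)  # custom scale forwarded explicitly
```

With random inputs the two outputs diverge, since the effective scale differs by a factor of `layer_idx + 1`; forwarding the model's own scale to the SDPA call is the kind of change the title describes.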