Fix fp16 aten_native_batch_norm when bias is None and training is True #1217
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

```
@@           Coverage Diff           @@
##             main    #1217   +/-   ##
=======================================
  Coverage   78.84%   78.84%
=======================================
  Files         119      119
  Lines       15690    15692    +2
  Branches     2479     2479
=======================================
+ Hits        12371    12373    +2
  Misses       2911     2911
  Partials      408      408
```

☔ View full report in Codecov by Sentry.
Test Results: 24 files ±0, 24 suites ±0, 1h 43m 36s ⏱️ (-2m 40s). For more details on these failures, see this check. Results for commit e6327ee. ± Comparison against base commit 4de6c80.
I saw some batch_norm failures in CI; I'm not sure if they are new. Also, why are they only failing on torch-nightly?
I cannot repro this locally with the same ort, onnx, and torch versions.
Looks like there are now assertion errors.
Fixes `nfnet` in https://github.com/microsoft/onnx-converters-private/issues/196

Two changes:

1. Cast `weight` and `bias` to the `input` dtype.
2. Use `input` when computing `mean` and `var`.

No idea how this was not covered by the unit tests. The test case seems to be there.
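The two changes can be sketched roughly as follows. This is an illustrative NumPy sketch of the dtype handling described above, not the actual `aten_native_batch_norm` implementation in this repository; the function name and structure are assumptions for illustration only.

```python
import numpy as np

def native_batch_norm_sketch(x, weight=None, bias=None, eps=1e-5):
    """Illustrative sketch of fp16-safe batch norm (training path only)."""
    channels = x.shape[1]
    # Change 1: a missing weight/bias is materialized in x's dtype, and any
    # provided weight/bias is cast to x's dtype, so fp16 inputs are never
    # mixed with fp32 parameters.
    if weight is None:
        weight = np.ones(channels, dtype=x.dtype)
    if bias is None:
        bias = np.zeros(channels, dtype=x.dtype)
    weight = weight.astype(x.dtype)
    bias = bias.astype(x.dtype)

    # Change 2: mean/var are computed from x itself, so every intermediate
    # stays in one consistent dtype.
    axes = tuple(d for d in range(x.ndim) if d != 1)
    mean = x.mean(axis=axes, keepdims=True)
    var = x.var(axis=axes, keepdims=True)

    shape = (1, channels) + (1,) * (x.ndim - 2)
    return (x - mean) / np.sqrt(var + eps) * weight.reshape(shape) + bias.reshape(shape)
```

With `bias=None` and an fp16 input, every intermediate and the output stay fp16, which is the behavior the PR title describes.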
Minimized repro:
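The original repro snippet did not survive extraction. As a hypothetical stand-in (not the author's exact code), the dtype mismatch the fix targets can be demonstrated in NumPy: when the parameters default to fp32 instead of matching the fp16 input, the output silently promotes to fp32.

```python
import numpy as np

# fp16 input, as in the failing nfnet case.
x = np.random.randn(1, 2, 3, 3).astype(np.float16)
# Buggy behavior: a default weight created without matching x's dtype.
weight = np.ones(2, dtype=np.float32)

mean = x.mean(axis=(0, 2, 3), keepdims=True)
var = x.var(axis=(0, 2, 3), keepdims=True)
# Mixing fp16 activations with the fp32 weight promotes the result to fp32,
# producing a dtype mismatch for downstream fp16 consumers.
y = (x - mean) / np.sqrt(var + 1e-5) * weight.reshape(1, 2, 1, 1)
print(y.dtype)  # float32, not float16
```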