Description
Opened on Feb 6, 2024
Running the FX op-consistency test for native_batch_norm fails when ONNX Runtime validates the exported model:

pytest test/onnx/test_fx_op_consistency.py -k test_output_match_native_batch_norm
Traceback (most recent call last):
  File "/home/liqun/anaconda3/envs/pytorch/lib/python3.10/unittest/case.py", line 59, in testPartExecutor
    yield
  File "/home/liqun/anaconda3/envs/pytorch/lib/python3.10/unittest/case.py", line 498, in subTest
    yield
  File "/home/liqun/LiqunWA/pytorch/test/onnx/test_fx_op_consistency.py", line 1861, in _run_test_output_match
    _compare_onnx_and_torch_exported_program(
  File "/home/liqun/LiqunWA/pytorch/test/onnx/test_fx_op_consistency.py", line 1755, in _compare_onnx_and_torch_exported_program
    onnx_outputs = onnx_exported_program(*input_args, **input_kwargs)
  File "/home/liqun/LiqunWA/pytorch/torch/onnx/_internal/exporter.py", line 719, in __call__
    ort_session = onnxruntime.InferenceSession(onnx_model, providers=providers)
  File "/home/liqun/anaconda3/envs/pytorch/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/home/liqun/anaconda3/envs/pytorch/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 474, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Type Error: Type (tensor(float16)) of output arg (_native_batch_norm_legit_1) of node (_aten_native_batch_norm_training_onnx_16) does not match expected type (tensor(float)).
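
For reference, a minimal standalone sketch of the same scenario (an assumption, not code taken from the test suite): export a float16 native_batch_norm in training mode with torch.onnx.dynamo_export and create an onnxruntime.InferenceSession on the result, which is the step where the type check above fires. The shapes and the file name native_batch_norm_fp16.onnx are made up for illustration.

import torch
import onnxruntime


class Repro(torch.nn.Module):
    def forward(self, x, weight, bias, running_mean, running_var):
        # training=True routes through the _aten_native_batch_norm_training_onnx
        # decomposition named in the error message.
        return torch.native_batch_norm(
            x, weight, bias, running_mean, running_var, True, 0.1, 1e-5
        )


args = (
    torch.randn(2, 3, 4, 4, dtype=torch.float16),  # input
    torch.randn(3, dtype=torch.float16),           # weight
    torch.randn(3, dtype=torch.float16),           # bias
    torch.zeros(3, dtype=torch.float16),           # running_mean
    torch.ones(3, dtype=torch.float16),            # running_var
)

onnx_program = torch.onnx.dynamo_export(Repro(), *args)
onnx_program.save("native_batch_norm_fp16.onnx")

# Session creation is where ONNX Runtime's type check rejects the model: the
# float16 output arg does not match the float output type declared by the
# _aten_native_batch_norm_training_onnx function.
session = onnxruntime.InferenceSession(
    "native_batch_norm_fp16.onnx", providers=["CPUExecutionProvider"]
)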