Describe the bug

I am trying to convert a TF-trained 3D U-Net. Below are the head and the tail of the model's summary:

Conversion command:

model.onnx is written, but I am getting the following two warnings in the output:

WARNING - transpose_input for input_1: shape must be rank 4, ignored
WARNING - transpose_output for activation: shape must be rank 4, ignored

The NCHW arguments indeed do not seem to have any effect: when I run the model in PyTorch on a (1, 1, 100, 100, 100) input, the output shape is (1, 100, 100, 100, 1). It is unclear to me why the input is accepted in NCHW format despite the warnings above.

What I expect is for the converted model to take (batch_size, 1, 100, 100, 100) inputs and produce (batch_size, 1, 100, 100, 100) outputs.

System information

OS Platform and Distribution: Ubuntu 20.04
TensorFlow version: 2.13.1
Python version: 3.11.9
ONNX version: 1.16.2
ONNXRuntime version:

To reproduce

I am new to tf2onnx, so I would first like to establish whether this is the intended behaviour. If not, I'll try to create a minimal working example (MWE).
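For what it's worth, since both warnings say the transpose is only applied to rank-4 tensors (and a 3D U-Net's tensors are rank 5), a possible interim workaround is to leave the converted model channels-last and transpose the data manually at inference time. A minimal sketch with NumPy, assuming the exported graph consumes NDHWC input of shape (batch, 100, 100, 100, 1); the array names and the commented-out session call are illustrative, not from the actual conversion:

```python
import numpy as np

def to_channels_last(x):
    """Transpose a 5-D NCDHW tensor (N, C, D, H, W) to NDHWC (N, D, H, W, C)."""
    return np.transpose(x, (0, 2, 3, 4, 1))

def to_channels_first(y):
    """Transpose a 5-D NDHWC tensor (N, D, H, W, C) back to NCDHW (N, C, D, H, W)."""
    return np.transpose(y, (0, 4, 1, 2, 3))

# Hypothetical input in the channels-first layout I expected the model to accept.
x_nchw = np.zeros((1, 1, 100, 100, 100), dtype=np.float32)

x_ndhwc = to_channels_last(x_nchw)   # shape (1, 100, 100, 100, 1)
# y_ndhwc = sess.run(None, {"input_1": x_ndhwc})[0]  # ONNX Runtime session call would go here
y_ndhwc = x_ndhwc                    # stand-in for the model output, same layout
y_nchw = to_channels_first(y_ndhwc)  # shape (1, 1, 100, 100, 100)

print(x_ndhwc.shape, y_nchw.shape)   # (1, 100, 100, 100, 1) (1, 1, 100, 100, 100)
```

The two permutations are inverses of each other, so wrapping the session call in this pair restores the channels-first interface end to end.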