Closed · Opened on May 22, 2019
Hi all, I created a model containing a "Where" operator but failed to load it via the ONNX Runtime C API:

`Could not find an implementation for the node Where(9)`

However, the version matrix says ONNX Runtime supports ONNX opset version 10.
The Python script to generate the model:
```python
import onnx

# Graph inputs/outputs: a string input, an int32 input, an int32 flag, and a string output.
onnx_input_str = onnx.helper.make_tensor_value_info("INPUT0", onnx.TensorProto.STRING, [1])
onnx_input_int = onnx.helper.make_tensor_value_info("INPUT1", onnx.TensorProto.INT32, [1])
onnx_flag = onnx.helper.make_tensor_value_info("READY", onnx.TensorProto.INT32, [1])
onnx_output = onnx.helper.make_tensor_value_info("OUTPUT", onnx.TensorProto.STRING, [1])

# Cast the string input to int32, add it to INPUT1, and build a zero tensor from READY.
internal_input = onnx.helper.make_node("Cast", ["INPUT0"], ["_INPUT"], to=onnx.TensorProto.INT32)
add = onnx.helper.make_node("Add", ["_INPUT", "INPUT1"], ["add"])
zeros = onnx.helper.make_node("Sub", ["READY", "READY"], ["zeros"])

# Select zeros where READY == 0, the sum otherwise, then cast back to string.
equal = onnx.helper.make_node("Equal", ["READY", "zeros"], ["equal"])
where = onnx.helper.make_node("Where", ["equal", "zeros", "add"], ["CAST"])
cast = onnx.helper.make_node("Cast", ["CAST"], ["OUTPUT"], to=onnx.TensorProto.STRING)

onnx_nodes = [internal_input, add, zeros, equal, where, cast]
onnx_inputs = [onnx_input_str, onnx_input_int, onnx_flag]
onnx_outputs = [onnx_output]
graph_proto = onnx.helper.make_graph(onnx_nodes, "conditional_and_cast", onnx_inputs, onnx_outputs)
model_def = onnx.helper.make_model(graph_proto)
onnx.save(model_def, "model.onnx")
```