
Conversation

@michaelgsharp
Contributor

Changed the ONNX export so that batch dimensions are exported as -1 instead of always 1. This lets ONNX Runtime infer the dimension from the data passed in, enabling batching if desired. ML.NET itself doesn't support batching, but the exported model can now be run directly in ORT with batched inputs while still supporting the single-row streaming approach that ML.NET uses.
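The effect of the change can be sketched in plain Python (an illustrative sketch, not ML.NET or ONNX Runtime code): a declared dimension of -1 acts as a wildcard that the runtime binds to whatever size arrives, whereas a fixed dimension of 1 would reject any batch larger than one row. The `shape_matches` helper below is a hypothetical name introduced for illustration.

```python
def shape_matches(declared, actual):
    """Check an input shape against a declared ONNX-style shape.

    A declared dimension of -1 is dynamic: it matches any actual size,
    which is what lets the same exported model serve both a streaming
    (batch of 1) and a batched workload. A fixed dimension must match
    exactly.
    """
    if len(declared) != len(actual):
        return False
    return all(d == -1 or d == a for d, a in zip(declared, actual))


# Old export: batch dimension fixed to 1 -> only single rows accepted.
old_shape = (1, 4)
# New export: batch dimension -1 -> any batch size accepted.
new_shape = (-1, 4)

print(shape_matches(old_shape, (8, 4)))  # a batch of 8 is rejected
print(shape_matches(new_shape, (8, 4)))  # a batch of 8 is accepted
print(shape_matches(new_shape, (1, 4)))  # streaming still works
```

In real ONNX models the same idea is expressed in the graph's value info, where a dynamic dimension is left symbolic instead of being pinned to 1.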

@michaelgsharp michaelgsharp requested review from a team, codemzs, ganik and harishsk February 5, 2020 01:01
@michaelgsharp michaelgsharp self-assigned this Feb 5, 2020
@michaelgsharp michaelgsharp merged commit eed6456 into dotnet:master Feb 6, 2020
@michaelgsharp michaelgsharp deleted the onnx-export-allow-batch branch February 6, 2020 18:57
@ghost ghost locked as resolved and limited conversation to collaborators Mar 19, 2022