Support dynamic batch inference with onnx/onnxruntime #45
Comments
Thanks. I'll run some experiments and see how it goes.
I followed the ONNX deployment walkthrough and ran …
This error occurs independently of the …
Hi @timmh, the default example only supports ONNX models with preprocessing. Could you please open a new ticket about inference without preprocessing, to make that issue easier to track?
🚀 Feature
Support dynamic batch inference with onnx/onnxruntime.

Motivation
As @makaveli10 pointed out in #39 (comment), the current implementation of the onnx/onnxruntime mechanism only supports dynamic shape inference, not a dynamic batch size. I don't know how to implement dynamic batch inference, so any help is welcome here.
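For reference, below is a minimal sketch (not this project's actual export path) of how a dynamic batch axis is typically declared with `torch.onnx.export` and consumed with `onnxruntime`. The model, file name, and tensor names (`images`, `outputs`) are placeholder assumptions.

```python
import numpy as np
import onnxruntime
import torch

# Hypothetical stand-in model; the real target would be the detection model
# exported by this project.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, kernel_size=3, padding=1),
    torch.nn.ReLU(),
    torch.nn.AdaptiveAvgPool2d(1),
    torch.nn.Flatten(),
    torch.nn.Linear(8, 10),
).eval()

dummy_input = torch.randn(1, 3, 224, 224)

# Marking dimension 0 of the input and output as dynamic lets the exported
# graph accept any batch size at inference time.
torch.onnx.export(
    model,
    dummy_input,
    "model_dynamic_batch.onnx",
    input_names=["images"],
    output_names=["outputs"],
    dynamic_axes={"images": {0: "batch"}, "outputs": {0: "batch"}},
)

# Run with a batch size different from the one used during export.
session = onnxruntime.InferenceSession("model_dynamic_batch.onnx")
batch = np.random.rand(4, 3, 224, 224).astype(np.float32)
(outputs,) = session.run(None, {"images": batch})
print(outputs.shape)  # leading dimension should be 4
```

Note that a detection model is harder than this toy case: per-image post-processing such as NMS produces a variable number of boxes per image, so the batched output is not a single fixed-shape tensor, which is likely why dynamic batch support is non-trivial here.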