Why does the official torch2trt not invoke any plugin, while this repo invokes many plugins even with a simple model? #7
Comments
Hi,
The official repo implements adaptive pooling as a regular pooling layer, computing the stride and kernel_size from the input shape. That is simple, but it does not work in the dynamic-shape case. The ONNX export uses reduce ops to do the pooling: all values along the given dims are gathered into one value. That is neat, but it did not cover the cases I had when I created this repo, because sometimes we need to do downsampling with adaptive pooling, such as in BFP.
TensorRT does not have an adaptive pooling layer, and I could not find a 'plan B' that avoids plugins. Most of the other plugins were created for similar reasons: either TensorRT has no implementation of the op, or TensorRT cannot convert it with dynamic shapes. You can replace the converters yourself, or simply use the official repo. I changed the name of this project last night, so you can install both of them without conflict (I have not tested this; if something is wrong, please let me know).
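For reference, here is a minimal PyTorch sketch (my own illustration, not code from either repo) of the two plugin-free tricks mentioned above and where each one breaks down:

```python
import torch
import torch.nn.functional as F

def adaptive_avg_pool2d_static(x, output_size):
    # Official-repo style: rewrite adaptive pooling as a plain pooling op.
    # kernel_size/stride come from the concrete input shape at conversion
    # time, so the resulting TensorRT layer is baked for one resolution
    # and cannot handle dynamic shapes.
    in_h, in_w = x.shape[2], x.shape[3]
    out_h, out_w = output_size
    stride = (in_h // out_h, in_w // out_w)
    kernel = (in_h - (out_h - 1) * stride[0],
              in_w - (out_w - 1) * stride[1])
    return F.avg_pool2d(x, kernel_size=kernel, stride=stride)

def global_avg_pool2d_reduce(x):
    # ONNX-export style for output_size == 1: a reduce (mean) over H and W.
    # Shape-agnostic, but it only covers global pooling, not downsampling
    # to an arbitrary output size (e.g. the adaptive pooling inside BFP).
    return x.mean(dim=(2, 3), keepdim=True)

x = torch.randn(1, 64, 56, 56)
# Both tricks match F.adaptive_avg_pool2d when the input size divides
# evenly by the output size.
assert torch.allclose(adaptive_avg_pool2d_static(x, (7, 7)),
                      F.adaptive_avg_pool2d(x, (7, 7)), atol=1e-6)
assert torch.allclose(global_avg_pool2d_reduce(x),
                      F.adaptive_avg_pool2d(x, 1), atol=1e-6)
```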
@grimoire I will give it a test.
A quick side question: what does "w/o plugin" mean?
@ideafold It means "without plugin".
I tested mobilenetv2.
The official repo can generate the engine without any plugin, while this repo produces some.
I don't need dynamic shapes for such a simple model; what are the differences from the official repo?
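For context, a static-shape mobilenetv2 conversion with the official torch2trt usually looks like the sketch below (based on the official README example; it assumes torch2trt, torchvision, and a CUDA device are available):

```python
import torch
import torchvision
from torch2trt import torch2trt

# Static-shape conversion with the official torch2trt.
model = torchvision.models.mobilenet_v2(pretrained=True).eval().cuda()
x = torch.ones((1, 3, 224, 224)).cuda()
model_trt = torch2trt(model, [x])

# Compare outputs of the PyTorch model and the TensorRT engine.
y = model(x)
y_trt = model_trt(x)
print(torch.max(torch.abs(y - y_trt)))
```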