
c++ inference #4

Open
lucasjinreal opened this issue Sep 29, 2020 · 7 comments

@lucasjinreal

Hi, I have converted the engine file, but when I try to run inference with the generated TRT engine file, there is an error:

[screenshot of the error]

Do you know the best way to make it find these plugins? I do have your amirstan plugins installed, but I don't know how to use them in C++. Is simply linking against the lib enough?

@grimoire (Owner)

Hi,
Linking the lib should be enough. If that doesn't work, invoke initLibAmirstanInferPlugins() from amirInferPlugin.h to load the plugins manually.
Read this for details.
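
Roughly, the loading step looks like the sketch below (a minimal example; the library name in the link line, the engine path, and the logger are placeholders — only initLibAmirstanInferPlugins() and amirInferPlugin.h come from amirstan_plugin):

// Build with something like (link line is an assumption):
//   g++ infer.cpp -o infer -lnvinfer -lamirstan_plugin
#include <fstream>
#include <iostream>
#include <iterator>
#include <vector>

#include <NvInfer.h>
#include <amirInferPlugin.h>  // declares initLibAmirstanInferPlugins()

class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING)
            std::cout << msg << std::endl;
    }
};

int main() {
    Logger logger;

    // Register the custom plugin creators with TensorRT *before*
    // deserializing; otherwise the plugin layers baked into the engine
    // cannot be resolved and loading fails.
    initLibAmirstanInferPlugins();

    // Read the serialized engine (path is a placeholder).
    std::ifstream file("model.trt", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(logger);
    nvinfer1::ICudaEngine* engine =
        runtime->deserializeCudaEngine(blob.data(), blob.size());
    if (engine == nullptr) {
        std::cerr << "failed to deserialize engine" << std::endl;
        return 1;
    }

    // ... create an execution context and run inference as usual ...
    return 0;
}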

@lucasjinreal (Author)

@grimoire I want to ask another question. For models that don't need any plugin at all, such as resnet50, which is a completely standard model, is it possible to generate the engine without the plugins? I mean, I want the engine to require plugins only when it really needs them, so that for normal models users can run inference directly with the stock TensorRT lib.

@grimoire (Owner)

Errr... Actually, TensorRT does not have a GAP (global average pooling) implementation; that's why resnet needs plugins.
Here are the layers TensorRT provides: https://docs.nvidia.com/deeplearning/tensorrt/api/python_api/infer/Graph/Layers.html#ipoolinglayer

You can use normal average pooling (if you don't need dynamic shapes) or a reduce instead of GAP.
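
In TensorRT terms, the first option is just the built-in average pooling layer with a window spanning the whole feature map. A rough C++ sketch of that (the function name, network, x, and the 7x7 window are illustrative placeholders, not code from this repo):

#include <NvInfer.h>

// Static-shape GAP without any plugin: a single average-pooling window that
// covers the entire feature map. `network` and `x` come from wherever the
// network is built; the 7x7 window stands in for the real feature-map size.
void addStaticGlobalAvgPool(nvinfer1::INetworkDefinition* network,
                            nvinfer1::ITensor* x) {
    nvinfer1::IPoolingLayer* pool = network->addPoolingNd(
        *x, nvinfer1::PoolingType::kAVERAGE, nvinfer1::DimsHW{7, 7});
    pool->setStrideNd(nvinfer1::DimsHW{1, 1});  // one 1x1 output per channel
}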

Read __init__.py; these are the layers that need plugins. Avoid using them if you don't want to depend on the plugins.


try:
    # custom plugin support
    from .GroupNorm import *
    from .repeat import *
    from .expand import *
    from .gather import *
    from .adaptive_avg_pool2d import *
    from .adaptive_max_pool2d import *
    from .AdaptiveAvgPool2d import *
    from .AdaptiveMaxPool2d import *
    from .meshgrid import convert_meshgrid
    from .grid_sample import convert_grid_sample
    from .flip import convert_flip
    from .cummax import convert_cummax
    from .cummin import convert_cummin
    from .cumsum import convert_cumsum
    from .cumprod import convert_cumprod
    from .expand_as import convert_expand_as
    from .deform_conv2d import convert_deform_conv2d
    from .nms import convert_nms
    from .roi_align import convert_roi_align, convert_RoiAlign
    from .roi_pool import convert_roi_pool, convert_RoIPool
except:
    print("plugin not found.")

@lucasjinreal (Author)

@grimoire Did you mean that if the model doesn't need a plugin, torch2trt will not generate an engine with plugins? AFAIK, converting resnet with onnx2trt doesn't invoke any plugin in its engine.

@grimoire (Owner) commented Oct 1, 2020

In this issue:

If you use TensorRT 7.*, the onnx operator GlobalAveragePool is implemented using IReduceLayer

IReduceLayer can reduce along the given dims; that's why the onnx -> tensorrt path doesn't need plugins.
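
For illustration, that mapping looks roughly like this in the TensorRT C++ API (network, x, and the function name are placeholders; the axes assume an NCHW tensor):

#include <NvInfer.h>

// GAP expressed as an IReduceLayer (mean over the spatial dims), roughly what
// the ONNX GlobalAveragePool import does on TensorRT 7. No fixed window size
// is required, so this also works with dynamic spatial shapes.
void addGapAsReduce(nvinfer1::INetworkDefinition* network,
                    nvinfer1::ITensor* x) {
    const uint32_t spatialAxes = (1u << 2) | (1u << 3);  // H and W of NCHW
    network->addReduce(*x, nvinfer1::ReduceOperation::kAVG, spatialAxes,
                       /*keepDimensions=*/true);
}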

But this repo also needs to support mmdetection-to-tensorrt, which needs AdaptivePooling to do downsampling (pool size != 1); IReduceLayer cannot help me with that task.
That said, reducing plugin usage is a good point. I will add a special case to GAP for pool size = 1 in the future.

@lucasjinreal (Author)

@grimoire I'm not asking specifically about resnet here, it's just an example. I want to know whether it is possible to build an engine with this tool that doesn't invoke any plugins, if the model doesn't need one (take vgg as an example).

@grimoire (Owner) commented Oct 2, 2020

If the model does not use the layers I mentioned above, the answer is yes.
