
load libtorch2trt_dynamic.so error #11

Open
Chen-cyw opened this issue Nov 30, 2020 · 5 comments

@Chen-cyw commented Nov 30, 2020

I exported libtorch2trt_dynamic.so with "sudo python setup.py develop --plugins" and configured the TensorRT and CUDA paths, but whenever I load the .so I get the error
"OSError: libtorch2trt_dynamic.so: undefined symbol: _ZNK6google8protobuf7Message11GetTypeNameEv"
Did you try and finish the whole build & load process?
The libtorch2trt_dynamic.so I export is only about 500 KB. When I build libtorch2trt.so the same way in the official torch2trt project, it is 7.2 MB and loads correctly.
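(For anyone debugging a similar OSError: the message comes from the dynamic loader when dlopen fails. A minimal sketch of surfacing that error from Python, assuming only the standard library — `check_loadable` is a hypothetical helper, not part of this repo:)

```python
import ctypes

def check_loadable(so_path):
    """Try to dlopen a shared library.

    Returns None on success, or the loader's error message
    (e.g. an undefined-symbol report) on failure.
    """
    try:
        ctypes.CDLL(so_path)
        return None
    except OSError as err:
        return str(err)

# Usage: a missing or broken library yields a loader error string.
msg = check_loadable("./libtorch2trt_dynamic.so")
if msg is not None:
    print("load failed:", msg)
```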

@grimoire (Owner)
The --plugins flag is not used in this repo. Please install by following the README of this repo, not the official one.

@Chen-cyw (Author)
The --plugins flag is not used in this repo. Please install by following the README of this repo, not the official one.

I configured setup.py like this:

    def initialize_plugins_options(cmd_obj):
        cmd_obj.plugins = True
        cmd_obj.cuda_dir = '/home/cyw/cuda-10.2'
        cmd_obj.torch_dir = None
        cmd_obj.trt_inc_dir = '/home/cyw/TensorRT-7.2.1.6/targets/x86_64-linux-gnu/include'
        cmd_obj.trt_lib_dir = '/home/cyw/TensorRT-7.2.1.6/targets/x86_64-linux-gnu/lib'

Is there anything else I need to do?

@grimoire (Owner)

Please follow these steps.
You do NOT need to change anything inside setup.py. The plugin used in this repo is amirstan_plugin.

@Chen-cyw (Author)

Please follow these steps.
You do NOT need to change anything inside setup.py. The plugin used in this repo is amirstan_plugin.

Thanks!
I want to use the model converted by mmdetection-to-tensorrt in a pure TensorRT environment (later in a TensorRT inference server). Do I just need to load libamirstan_plugin.so in TensorRT? Is torch2trt_dynamic only needed for conversion, not at runtime?
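(Sketch of what "load the plugin in a pure TensorRT process" could look like in Python, assuming the plugin library registers its TensorRT plugin creators when loaded; the path, engine filename, and `load_plugin_lib` helper are illustrative, not from this repo:)

```python
import ctypes

def load_plugin_lib(path="libamirstan_plugin.so"):
    """Load a TensorRT plugin library with RTLD_GLOBAL so its
    plugin creators become visible to the TensorRT runtime.

    Returns True on success, False if the library cannot be loaded.
    """
    try:
        ctypes.CDLL(path, mode=ctypes.RTLD_GLOBAL)
        return True
    except OSError:
        return False

# After loading the plugin library, you would deserialize the engine,
# e.g. (sketch, assuming the `tensorrt` Python package is installed):
#
#   import tensorrt as trt
#   logger = trt.Logger(trt.Logger.WARNING)
#   trt.init_libnvinfer_plugins(logger, "")
#   with open("model.engine", "rb") as f, trt.Runtime(logger) as runtime:
#       engine = runtime.deserialize_cuda_engine(f.read())
```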

@grimoire (Owner)

Yes, read this for details.
