
Is Amirstan_plugin now compulsory? #39

Open
albertodallolio opened this issue Oct 28, 2024 · 3 comments

@albertodallolio

Dear grimoire, thanks a lot for your repo.

I am planning to update my TensorRT and torch versions, which requires the latest version of this repo. However, I noticed that if I install torch2trt_dynamic from HEAD after #33, I get the following error:

OSError: /opt/conda/lib/python3.11/site-packages/torch2trt_dynamic/plugins/libamirstan_plugin.so: cannot open shared object file: No such file or directory

Is amirstan_plugin no longer optional? If so, would it be possible to make it optional again?
I was using this repo from #25 and could successfully export TRT models without the amirstan plugin.

Thanks in advance for your help.

@grimoire
Owner

The plugin is loaded here:

import ctypes
import os.path as osp

def load_plugin_library():
    ctypes.CDLL(osp.join(dir_path, 'libamirstan_plugin.so'))

You can wrap it in a try-except block to skip loading when the library is missing.
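A minimal sketch of that workaround (here `dir_path` is a stand-in for the package's plugin directory):

```python
import ctypes
import os.path as osp

dir_path = osp.abspath('.')  # stand-in for the package's plugin directory

def load_plugin_library():
    """Load libamirstan_plugin.so if present; treat it as optional otherwise."""
    try:
        ctypes.CDLL(osp.join(dir_path, 'libamirstan_plugin.so'))
    except OSError:
        # Library not found: skip it. Only converters that rely on the
        # plugin's custom TensorRT layers will fail later.
        pass
```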

@albertodallolio
Author

Hey @grimoire, thanks a lot for the suggestion.

That solves the issue. I also needed to modify these lines to work with TensorRT 10.5.0; the fix is to use INTERPOLATION_MODE instead of ResizeMode. (I can open a PR if you want to address this.)
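One way to handle that rename without pinning a TensorRT version is a small compatibility helper. This is only a sketch: `resize_mode_enum` is a hypothetical name, and it assumes TensorRT 10 exposes `InterpolationMode` where older releases exposed `ResizeMode`:

```python
def resize_mode_enum(trt):
    """Return the resize-mode enum for both old and new TensorRT.

    TensorRT 10 renamed ``ResizeMode`` to ``InterpolationMode``; older
    releases only have ``ResizeMode``. Pass the imported ``tensorrt``
    module as ``trt``.
    """
    mode = getattr(trt, 'InterpolationMode', None)
    return mode if mode is not None else getattr(trt, 'ResizeMode')
```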

However I am now getting a separate error:

Traceback (most recent call last):
  File "/my_code/trt_generation/torch_2_trt_fmaps.py", line 109, in <module>
    model_trt = torch2trt_dynamic(model, [images_iter_1[0][0], images_iter_1[0][1]], fp16_mode=True, opt_shape_param=opt_shape_param)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/torch2trt_dynamic/torch2trt_dynamic.py", line 641, in torch2trt_dynamic
    return module2trt(module, args=inputs, config=config, log_level=log_level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/torch2trt_dynamic/torch2trt_dynamic.py", line 599, in module2trt
    return func2trt(
           ^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/torch2trt_dynamic/torch2trt_dynamic.py", line 587, in func2trt
    engine, module_meta = build_engine(func, inputs, config, log_level)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/torch2trt_dynamic/torch2trt_dynamic.py", line 556, in build_engine
    network, module_meta = build_network(builder, func, inputs, config=config)
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/torch2trt_dynamic/torch2trt_dynamic.py", line 486, in build_network
    outputs = func(**inputs)
              ^^^^^^^^^^^^^^
  File "/my_code/cres_nets/extractor.py", line 164, in forward
    fmap1_dw8 = F.avg_pool2d(fmap1, 2, stride=2)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/torch2trt_dynamic/torch2trt_dynamic.py", line 313, in wrapper
    converter['converter'](ctx)
  File "/opt/conda/lib/python3.11/site-packages/torch2trt_dynamic/converters/avg_pool2d.py", line 19, in convert_avg_pool2d
    input_trt = trt_(ctx.network, input)
                ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/torch2trt_dynamic/torch2trt_dynamic.py", line 132, in trt_
    num_dim = len(t._trt.shape)
              ^^^^^^^^^^^^^^^^^
ValueError: __len__() should return >= 0

and I am fairly sure it is coming from this step:

fmap1 = x[:x.shape[0] // 2]
fmap2 = x[x.shape[0] // 2:]
return [fmap1, fmap2]
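For reference, the same batch split written with torch.chunk (input shape assumed for illustration):

```python
import torch

x = torch.randn(2, 256, 32, 64)  # assumed (batch, channels, H, W)

# Split the batch dimension into two equal halves; equivalent to
# x[:x.shape[0] // 2] and x[x.shape[0] // 2:].
fmap1, fmap2 = torch.chunk(x, 2, dim=0)
```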

I have also tried torch.split and torch.chunk, as suggested in #21, but I still get the same error. I added a print right before the failing line (num_dim = len(t._trt.shape)) and noticed that the tensor shape changes drastically after the split:

# Before
(2, 256, -1, -1)
# After
(81)

Any idea?

Thanks in advance for your help.

@grimoire
Owner

> I can open a PR if you want to address this

Sure! That would be cool.

I do not have a clear idea about the error. Could you provide me with simple reproduction code?
