
[Relay][Frontend][Pytorch] Prelu definition mismatch in pytorch #8184

Closed
YuhengHuang42 opened this issue Jun 3, 2021 · 1 comment · Fixed by #8192

Comments

@YuhengHuang42
Contributor

Description

A similar bug was found in the ONNX frontend and fixed by this PR: #7208

However, the bug still exists in the PyTorch frontend.

By definition (see the PReLU definition in PyTorch), num_parameters can be set to either 1 or the number of input channels.

However, PReLU in TVM currently seems to support only num_parameters = number of channels, so an error is raised if you set num_parameters = 1 while the input has more than one channel.

Note that PyTorch sets num_parameters = 1 by default.
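
A fix could mirror the ONNX fix in #7208: broadcast a scalar alpha to the channel count before calling relay.nn.prelu. Below is a minimal sketch, assuming a standalone helper (convert_prelu and its arguments are hypothetical names for illustration, not the actual TVM frontend code):

from tvm import relay

# Hypothetical helper; the real fix would live in the PyTorch frontend converter.
def convert_prelu(data, alpha, num_channels, axis=1):
    # PyTorch allows num_parameters == 1, but relay.nn.prelu expects one
    # alpha value per channel, so broadcast a scalar alpha first.
    alpha = relay.broadcast_to(alpha, (num_channels,))
    return relay.nn.prelu(data, alpha, axis=axis)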

(The current workaround is to export the PyTorch model to ONNX first and then import the ONNX model; a rough sketch follows.)
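
For reference, the ONNX round trip looks roughly like this (the file name prelu.onnx and the input name input0 are illustrative):

import onnx
import torch
from tvm import relay

model = torch.nn.Sequential(torch.nn.PReLU(num_parameters=1)).eval()
input_shape = (1, 6, 10, 10)
dummy = torch.randn(input_shape)

# Export to ONNX, then import the ONNX model instead of the traced module.
torch.onnx.export(model, dummy, "prelu.onnx", input_names=["input0"])
onnx_model = onnx.load("prelu.onnx")
mod, params = relay.frontend.from_onnx(onnx_model, shape={"input0": input_shape})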

Code to reproduce

import torch
import tvm
from tvm import relay

# PReLU with a single learnable parameter (PyTorch's default).
minimal_example = torch.nn.Sequential(
    torch.nn.PReLU(num_parameters=1)
)
minimal_example.eval()

# Multi-channel input: num_parameters (1) != number of channels (6).
input_shape = (1, 6, 10, 10)
random_input = torch.randn(input_shape)
trace = torch.jit.trace(minimal_example, random_input)
input_info = [("input0", input_shape)]
mod, params = tvm.relay.frontend.from_pytorch(trace, input_info)

Environment

TVM version: 0.8.dev0 at cc3d60e

PyTorch version: 1.8.1

OS version: macOS 10.15.7

@masahi
Member

masahi commented Jun 3, 2021

Thanks, it would be great if you could send a PR.
