Adding aten::unsqueeze_ to PT Frontend #7231
Conversation
This is probably not how we should support it. Later I found a promising way to support inplace copy/assignment: we first convert … to Relay … (see tvm/python/tvm/relay/frontend/pytorch.py, line 2643 at d1399f3).
@t-vi Does this sound like a reasonable way to support inplace assignment? |
@masahi Can you elaborate a bit on how you want to do that? To my mind there are two parts to this. For slice assignments like

@torch.jit.script
def foo(x):
    x[:5] = 0
    return 2 * x

we get … Note how … So we need three things: …
Note that while using the pattern … would be possible for this simple case,

@torch.jit.script
def foo(x):
    y = x[:2]
    y[:5] = 0
    return 2 * x

will mess that up for you. P.S.: While you mention working on the TorchScript …, I have a concrete plan (see my blog) to support more graph operations in TorchScript from Python, so if you need something in particular, we might try to get that. |
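For what it's worth, a rough sketch of what poking at a scripted graph from Python can look like (the trailing-underscore check below is just the ATen naming convention for mutating ops, not an official mutation query):

import torch

@torch.jit.script
def foo(x):
    y = x[:2]
    y[:5] = 0
    return 2 * x

g = foo.graph
# Walk the scripted graph and list in-place ATen ops; by convention their
# names end with an underscore (aten::copy_, aten::fill_, ...).
for node in g.nodes():
    kind = node.kind()
    if kind.startswith("aten::") and kind.endswith("_"):
        print("mutating op:", kind)
print(g)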
For the first case, using the two passes I mentioned, I get this graph: … It seems … But right, your second example results in … because … |
Maybe they tried to take the same matching shortcut internally... |
I experimented with this approach to support inplace update in torchvision Faster R-CNN / Mask R-CNN. This code path is not hit during tracing mode, so it is not an issue for us, but I tried anyway to see if it would work. I thought it might be promising, but indeed there could be many limitations / corner cases ... |
@codeislife99 As discussed, we still think adding a half-baked conversion of copy_ is not a good idea. It might work in some cases, but it could generate a graph that is not consistent with the Torch model. Please remove it.
I don't want to ruin the party, but does

@torch.jit.script
def foo(x):
    y = x.unsqueeze_(0)
    return x

work? As far as I know, we're only supporting "strided" tensors (i.e. a blob of memory plus sizes and strides defines the tensor), so tracking views across these should be doable (one could, but doesn't have to); see the signature annotations in ATen's native_functions.yaml to see which of the ops we have need to be handled. One of the tricky ones would be reshape, which is sometimes just a view and at other times (when viewing is impossible, e.g. for some non-contiguous inputs) a copy. The other option could be to do this on the PyTorch side, improving the pass that @masahi highlighted. |
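A quick eager-mode illustration of why this case is tricky: the in-place op changes the input tensor's metadata and returns the very same tensor object, so returning x already reflects the new shape:

import torch

x = torch.randn(5, 5)
y = x.unsqueeze_(0)   # mutates x's shape in place and returns x itself
print(x.shape)        # torch.Size([1, 5, 5]) -- the input was changed
print(y is x)         # True: no new tensor was created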
Thanks for the detailed discussion and for uncovering some things that I didn't know about. I will remove copy_. |
I'm also not sure if inplace unsqueeze can be supported without any concern, or more generally how well we are doing with other inplace ops. @codeislife99 Does this change work for your real model (passing your contrived test is not enough)? |
@masahi, @codeislife99: I mentioned this to one of the other PyTorch devs, and he mentioned that there is a remove mutation pass in PyTorch that should take care of the cases where the PyTorch alias analysis can prove it is safe to remove it: … |
Interesting, if that can convert … |
I am still waiting for the full scripted model from my customer. I will try the above remedy once that is ready and then I will make changes to this PR. |
I have removed copy_ from this PR and added a few ops after unsqueeze in the test to make sure it works. |
The inplace op recommendation using … |
Later maybe we should look in more detail into why the remove mutation pass is not working.
Thanks @codeislife99 @t-vi |
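For reference, a rough sketch of how one might check whether that pass rewrites a given graph; torch._C._jit_pass_remove_mutation is an internal, version-dependent binding, and mutating a graph input (as in the unsqueeze_ example) is externally observable, so the pass may legitimately leave it untouched:

import torch

@torch.jit.script
def fn(x):
    _ = x.unsqueeze_(2)
    return x * 2

g = fn.graph
print("before:\n", g)
# Internal pass: rewrites in-place ops to functional ones where alias
# analysis can prove the mutation is not observable from the outside.
torch._C._jit_pass_remove_mutation(g)
print("after:\n", g)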
FWIW, I don't think the test is really effective in ensuring that it's good.

@torch.jit.script
def fn(x):
    _ = x.unsqueeze_(2)
    y = x * 2
    return y

m, p = tvm.relay.frontend.from_pytorch(fn, [('input', [5, 5])])
m2 = tvm.relay.transform.InferType()(m)
print(m2['main'].body.checked_type)
print(fn(torch.randn(5, 5)).shape)

gives …
|
@codeislife99 can you follow up? |
* Added Ops
* Regular
* Remove copy
* Remove copy
* Tests
* Black

Co-authored-by: Ubuntu <ubuntu@ip-172-31-27-149.us-east-2.compute.internal>
Co-authored-by: Ubuntu <ubuntu@ip-172-31-19-34.us-east-2.compute.internal>
@t-vi When I try to use your example as a test case, it works. I am trying to understand what the underlying problem is ... |
I don't know what the test case does, but I'm relatively sure it doesn't handle the update of the input, since running the above example gives a wrong shape in TVM. |
This is the test case that I added to the test_forward_unsqueeze() function and it seems to pass, and I checked the shape of y: it comes out as [5, 5, 1]. |
The subtlety is that the JIT tracer actually resolves this during tracing, but it will be buggy if the in-place-updated input shows up elsewhere in the graph, e.g. through scripted parts of the model. |
To keep things transparent, I'm copying a terse bit of explanation from Slack: to see what goes wrong and why the test doesn't catch it, compare the graphs for the traced and scripted versions of the above function (or the equivalent module). They crucially differ in the input to the multiply … So for just … To properly support inplace (say for …) … |
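A minimal way to see that difference, assuming an eager definition of the same function (exact node names depend on the PyTorch version):

import torch

def f(x):
    _ = x.unsqueeze_(2)
    return x * 2

traced = torch.jit.trace(f, torch.randn(5, 5))   # the tracer may warn about mutating an input
scripted = torch.jit.script(f)

# In the traced graph the multiply typically consumes the unsqueeze_ output,
# while in the scripted graph it consumes the original input %x and the
# mutation is only visible through alias information.
print(traced.graph)
print(scripted.graph)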
This PR adds the above ops to the PT Frontend to support customer models.