
Boolean masking backwards doesn't work even with dynamic output shape ops, break accordingly (pytorch#114126)

Signed-off-by: Edward Z. Yang <ezyang@meta.com>

Pull Request resolved: pytorch#114126
Approved by: https://github.com/albanD
ezyang authored and pytorchmergebot committed Nov 20, 2023
1 parent 039a468 commit 934e9c3
Showing 1 changed file with 2 additions and 3 deletions.

torch/_dynamo/variables/tensor.py
@@ -589,13 +589,12 @@ def has_bool_key(v):
             return False

         if (
-            not config.capture_dynamic_output_shape_ops
-            and has_bool_key(key)
+            has_bool_key(key)
             and isinstance(value, TensorVariable)
             and value.requires_grad
         ):
             unimplemented(
-                "boolean masking setitem backwards requires dynamic shapes"
+                "boolean masking setitem backwards, see https://github.com/pytorch/pytorch/issues/114123"
             )
         tx.output.create_proxy(
             "call_function",
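For context, the check this commit touches fires when Dynamo traces a boolean-mask `__setitem__` on a tensor that requires grad; with this change the graph break happens unconditionally, since the backward of that op is unsupported even when `capture_dynamic_output_shape_ops` is enabled. A minimal sketch of the pattern (eager mode shown, where autograd still handles it; wrapping this in `torch.compile` is where the break would occur):

```python
import torch

# A tensor that participates in autograd.
x = torch.randn(4, requires_grad=True)
y = x * 2  # non-leaf tensor, requires_grad=True

# Boolean masking setitem on a grad-requiring tensor: this is the
# pattern the unimplemented() branch above guards against in Dynamo.
mask = y.detach() > 0
y[mask] = 0.0

# Eager autograd handles it fine: overwritten positions get zero grad,
# the rest flow through the x * 2 factor.
y.sum().backward()
print(x.grad)
```

The masked positions contribute zero gradient (their values were overwritten by a constant), while unmasked positions receive the chain-rule factor 2 from `y = x * 2`.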
