
Add check for second value in sum: Logsumexp #90


Closed
wants to merge 11 commits into from

Conversation

shivam096
Contributor

Added conditions to check whether a second value exists in the "sum" call for the Logsumexp update. If not, no change is made.

Files updated:

  • torchfix/visitors/misc/init.py
  • tests/fixtures/misc/checker/logsumexp.py
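
For background (not part of the PR itself): the Logsumexp check targets the pattern `torch.log(torch.sum(torch.exp(x), dim))`, which `torch.logsumexp` computes in a numerically stable way. A minimal plain-Python sketch of the identity behind the rewrite (stdlib only, no torch):

```python
import math

def naive_logsumexp(xs):
    # log(sum(exp(x_i))) computed directly; overflows for large x_i
    return math.log(sum(math.exp(x) for x in xs))

def stable_logsumexp(xs):
    # the shifted form logsumexp implementations use: subtract the max first,
    # so every exponent is <= 0 and cannot overflow
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))
```

For small inputs both agree; for large inputs only the stable form survives, which is why the fixer suggests `torch.logsumexp` over the hand-written composition.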

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Jan 23, 2025
@shivam096 shivam096 requested a review from kit1980 January 24, 2025 20:18
Comment on lines 19 to 20
y = torch.sum(torch.log(torch.exp(x)), dim=1)
y = torch.sum(torch.log(torch.exp(x)), dim=None)
Contributor

Change to the order of calls to log(sum(exp())) as we discussed.

Contributor Author

I thought these were negative test cases. Will update.

if (
    self.get_specific_arg(
        node.args[0].value, arg_name="dim", arg_pos=1
    ).value.value
Contributor

You check the argument's value without first checking that the argument is present.
It should be two nested if's: first check that it's present, then that its value is not None.
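
The two-level check the reviewer describes can be sketched in plain Python (the `Arg` dataclass and dict lookup below are stand-ins for the CST node and `get_specific_arg`, not TorchFix's actual API):

```python
from dataclasses import dataclass
from typing import Any, Dict, Optional

@dataclass
class Arg:
    # Stand-in for a CST argument node; `value` mimics the literal it wraps.
    value: Optional[Any]

def dim_is_usable(args: Dict[str, Arg]) -> bool:
    dim_arg = args.get("dim")          # stand-in for get_specific_arg(...)
    if dim_arg is not None:            # first: is the argument present at all?
        if dim_arg.value is not None:  # second: is its value something other than None?
            return True
    return False
```

Collapsing the two conditions into one expression is what made the original code dereference `.value` on a possibly-absent argument.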

Contributor Author

The first "if" condition on line 187 checks for the presence of "dim"; if it is present, control moves to the second "if" on line 190.

Contributor

I see now.
You should assign the arg to a variable to reduce code duplication.
Then add an assert that it's not None; otherwise mypy complains:

https://github.com/pytorch-labs/torchfix/actions/runs/13081681874/job/36506448560?pr=90

See this for example https://github.com/pytorch-labs/torchfix/blob/main/torchfix/visitors/deprecated_symbols/__init__.py#L35
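
The assign-then-assert pattern being suggested looks roughly like this (a generic sketch, not the actual TorchFix code):

```python
from typing import Optional

def shout(maybe_text: Optional[str]) -> str:
    # Bind the Optional value once, then assert it is not None.
    # The assert lets mypy narrow Optional[str] to str for the rest
    # of the function, silencing "has no attribute" complaints,
    # while also avoiding a repeated lookup expression.
    text = maybe_text
    assert text is not None
    return text.upper()
```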

Contributor

You may also need to use ensure_type, like here: https://github.com/pytorch-labs/torchfix/blob/main/torchfix/visitors/deprecated_symbols/qr.py#L19

Please run mypy locally and verify it passes.
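
For illustration, here is a rough stdlib-only re-implementation of what an `ensure_type`-style helper does (libcst ships its own; this stand-in just shows the idea of a runtime check that doubles as static type narrowing):

```python
from typing import Type, TypeVar

T = TypeVar("T")

def ensure_type(obj: object, cls: Type[T]) -> T:
    # Raise if obj is not an instance of cls; otherwise return it with
    # the narrowed static type T, so mypy accepts attribute access on it.
    if not isinstance(obj, cls):
        raise TypeError(f"expected {cls.__name__}, got {type(obj).__name__}")
    return obj
```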

Contributor Author

@shivam096 shivam096 Jan 31, 2025

I ran the code, and the mypy errors are only resolved when I do an isinstance() check.

In doing so, I need to check for both Integer and Tuple, since dim can take either type as its argument value, as opposed to type Name, which appears when the value is None.

And since a tuple's value cannot be retrieved through .value, I need to restructure the code so it can handle both integer and tuple values for dim that are integers and are not None.

Contributor Author

I have updated the code and made the necessary changes to handle any future type-based issues.

Updated the test cases as well.

    node.args[0].value, arg_name="dim", arg_pos=1
)
if dim_arg:  # checks if dim argument is present
    if isinstance(
Contributor

@kit1980 kit1980 Feb 3, 2025

These lines are redundant, no?
Later there are checks for cst.Integer and cst.Tuple.

Contributor Author

Yes. Removed them, since they were leftover test code.

    dim_arg.value, cst.Tuple
):  # checks if dim argument is an integer or tuple
    if (
        isinstance(dim_arg.value, cst.Integer)
Contributor

cst.Integer cannot be None, so this is a meaningless condition.

Contributor Author

Here the condition checks whether the value is of type Integer and also makes sure the value it holds is not None, since tuples passed to dim cannot contain None values.

y = torch.log(torch.sum(torch.exp(x)), dim=None)  # dim is not part of the sum function call and dim is None
y = torch.log(torch.sum(torch.exp(x), keepdim=True, dim=None))  # dim argument cannot be None
y = torch.log(torch.sum(torch.exp(x), dim=(1, None)))  # dim argument cannot be a tuple with None
y = torch.log(torch.sum(torch.exp(x), dim=(None, None)))  # dim argument cannot be a tuple with None
Contributor

@kit1980 kit1980 Feb 4, 2025

No need to check for dim=(None,None) or dim=(1,None); that cannot happen, because when present, dim is an int or a tuple of ints: https://pytorch.org/docs/stable/generated/torch.sum.html
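
Given that constraint, the validity check over plain Python values (a hypothetical distillation of the logic, not the PR's CST-based code) reduces to:

```python
from typing import Any

def dim_is_valid(dim: Any) -> bool:
    # Per the torch.sum docs, dim, when present, is an int or a tuple
    # of ints; None (or anything else) means the logsumexp rewrite
    # should not fire.
    if isinstance(dim, int):
        return True
    if isinstance(dim, tuple):
        return all(isinstance(d, int) for d in dim)
    return False
```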

@kit1980
Contributor

kit1980 commented Feb 4, 2025

#91

@kit1980
Contributor

kit1980 commented Feb 4, 2025

Closing this in favor of #91.

@kit1980 kit1980 closed this Feb 4, 2025