Do not coerce gradients to TensorVariable #1685
Conversation
jessegrabowski left a comment
Looks good, but mypy is failing.
How did this come up?
I was trying to use it in the loop autodiff example. Ended up not wanting it, but fixed it anyway.
This could cause spurious disconnected errors, because the tensorified variable was not in the graph of the cost
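For context, a minimal sketch of the failure mode (the variables and names below are illustrative, not taken from the patch): coercing or rebuilding a variable produces a new graph node, and `pytensor.grad` treats any `wrt` that is not an ancestor of the cost as disconnected.

```python
import pytensor
import pytensor.tensor as pt

x = pt.dvector("x")
cost = (x**2).sum()

# Cloning (or coercing) x yields a distinct Variable object that is
# structurally identical but does not appear in the cost graph.
x_clone = x.clone()

g = pytensor.grad(cost, wrt=x)  # fine: x is an ancestor of cost
# pytensor.grad(cost, wrt=x_clone)  # raises DisconnectedInputError,
# since x_clone never feeds into cost
```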
Force-pushed from 4fa414c to 42f1367
This should be fine now, lmk if you object to the mypy fix. The same thing came up in #1828. If we use …
```python
# Normalize `f` (and, below, `eval_points`) to a list of variables.
if not isinstance(f, Sequence):
    _f: list[Variable] = [pytensor.tensor.as_tensor_variable(f)]
else:
    _f = [pytensor.tensor.as_tensor_variable(x) for x in f]
if not isinstance(eval_points, Sequence):
```
Annoying thing is that `Sequence` includes `str`, so it's not always the check we want. Fine here.
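A minimal standalone illustration of the pitfall (not from the diff): `str` is itself a `Sequence`, so an `isinstance(..., Sequence)` check accepts a bare string where a list was intended.

```python
from collections.abc import Sequence

# str is a Sequence of 1-character strings, so isinstance alone
# cannot distinguish a single name from a list of names.
assert isinstance("x", Sequence)
assert isinstance(["x", "y"], Sequence)

# Common workaround when strings should be treated as atomic values:
def is_non_str_sequence(obj) -> bool:
    return isinstance(obj, Sequence) and not isinstance(obj, str)

assert not is_non_str_sequence("x")
assert is_non_str_sequence(["x", "y"])
```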
* Do not coerce gradients to TensorVariable

  This could cause spurious disconnected errors, because the tensorified
  variable was not in the graph of the cost

* Type-consistent checks

---------

Co-authored-by: jessegrabowski <jessegrabowski@gmail.com>
📚 Documentation preview 📚: https://pytensor--1685.org.readthedocs.build/en/1685/