
Infer logcdf of discrete transformations #7444

Merged 2 commits into pymc-devs:main on Aug 19, 2024

Conversation

ricardoV94 (Member) commented Aug 1, 2024

Description

Allow logcdf inference for discrete transformations
With this we can also remove the special case for MaxNeg (used to derive the logprob of Min operations)
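
A rough sketch of what this enables (assuming pm.logcdf dispatches through the logprob inference machinery for derived expressions like this; the example itself is illustrative, not from the PR):

```python
import pymc as pm

x = pm.Poisson.dist(mu=3.0)
y = 5 - x  # add + neg transform of a discrete RV

# With this change the logcdf of the transformed variable can be inferred
# from the base variable's logcdf instead of raising NotImplementedError.
# P(5 - X <= 2) = P(X >= 3)
print(pm.logcdf(y, 2).eval())
```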


📚 Documentation preview 📚: https://pymc--7444.org.readthedocs.build/en/7444/

ricardoV94 force-pushed the simplify_order_logp branch 4 times, most recently from f042733 to 159da27 on August 6, 2024 12:39

codecov bot commented Aug 6, 2024

Codecov Report

Attention: Patch coverage is 93.93939% with 2 lines in your changes missing coverage. Please review.

Project coverage is 92.17%. Comparing base (772825e) to head (e5899b4).
Report is 2 commits behind head on main.

Files | Patch % | Lines
pymc/logprob/order.py | 93.10% | 2 Missing ⚠️
Additional details and impacted files


@@            Coverage Diff             @@
##             main    #7444      +/-   ##
==========================================
- Coverage   92.17%   92.17%   -0.01%     
==========================================
  Files         103      103              
  Lines       17260    17217      -43     
==========================================
- Hits        15910    15869      -41     
+ Misses       1350     1348       -2     
Files | Coverage Δ
pymc/logprob/transforms.py | 95.47% <100.00%> (+0.01%) ⬆️
pymc/logprob/order.py | 93.75% <93.10%> (-0.75%) ⬇️

ricardoV94 marked this pull request as ready for review on August 6, 2024 21:38
larryshamalama (Member) left a comment


Looks good as always @ricardoV94 :)

Just to clarify: when you say discrete transformations, do you mean monotonic transformations of discrete variables?

backward_value = op.transform_elemwise.backward(value, *other_inputs)

# Fail if transformation is not injective
# A TensorVariable is returned in 1-to-1 inversions, and a tuple in 1-to-many
if isinstance(backward_value, tuple):
    raise NotImplementedError

is_discrete = measurable_input.type.dtype.startswith("int")
larryshamalama (Member) commented:
Shall we use discrete_types here?

discrete_types = bool_types | int_types

ricardoV94 (Member, Author) replied:
No strong preference
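
For reference, a minimal sketch of what the suggested membership check could look like (illustrative only: the set literal below stands in for the discrete_types definition quoted above, and measurable_input comes from the hunk under review):

```python
# Stand-in for discrete_types = bool_types | int_types
DISCRETE_DTYPES = {
    "bool",
    "int8", "int16", "int32", "int64",
    "uint8", "uint16", "uint32", "uint64",
}

is_discrete = measurable_input.type.dtype in DISCRETE_DTYPES
```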

return max_rv
measurable_max_class = (
    MeasurableMaxDiscrete if latent_base_var.type.dtype.startswith("int") else MeasurableMax
)
larryshamalama (Member) commented:
idem comment as below
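
For context on why integer-valued inputs get their own MeasurableMaxDiscrete: the probability of the max of discrete iid variables follows a different identity than the continuous density formula. A quick numerical check of that identity (illustrative sketch, not PyMC internals):

```python
# For iid integer-valued X_1..X_n with CDF F:
#   P(max(X_1, ..., X_n) = m) = F(m)**n - F(m - 1)**n
from scipy import stats

n, m = 5, 4
x = stats.poisson(mu=3.0)

analytic = x.cdf(m) ** n - x.cdf(m - 1) ** n
empirical = (x.rvs(size=(200_000, n), random_state=0).max(axis=1) == m).mean()
print(analytic, empirical)  # the two should roughly agree
```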

ricardoV94 (Member, Author) commented Aug 7, 2024

> Looks good as always @ricardoV94 :)
>
> Just to clarify: when you say discrete transformations, do you mean monotonic transformations of discrete variables?

Yes. Right now only add and neg transforms are supported. We could add integer mul as well. Not keen to add support for integer->reals transforms just yet.

Otherwise, for continuous RVs we have a single case of non-monotonicity we support (pow with negative exponent):

elif isinstance(op.scalar_op, Pow):
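
As a numerical illustration of the kind of identity the add/neg support relies on for discrete variables (a sketch, not the PyMC implementation): for integer-valued X and integer y, P(-X <= y) = P(X >= -y) = 1 - F_X(-y - 1).

```python
from scipy import stats

x = stats.poisson(mu=3.0)
y = -2  # point at which to evaluate the CDF of -X

analytic = 1.0 - x.cdf(-y - 1)
empirical = (-x.rvs(size=200_000, random_state=1) <= y).mean()
print(analytic, empirical)  # the two should roughly agree
```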

ricardoV94 merged commit 48a8b6b into pymc-devs:main on Aug 19, 2024
22 checks passed