Move slightly duplicated RVs to PyTensor #6479

Open
Tracked by #7053
ricardoV94 opened this issue Jan 24, 2023 · 3 comments

Comments

@ricardoV94
Member

Description

We have a few RVs implemented in PyMC with similar (or simply duplicated) forms in PyTensor. We should move those RVs to PyTensor, or remove them, to avoid the code duplication.

  • WaldRV (extra param)
  • BetaClippedRV (small change in perform)
  • StudentTRV (duplicated)
  • WeibullBetaRV (extra param)
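For context, the "extra param" cases are typically thin wrappers around a NumPy generator method. A minimal sketch of the pattern, using WeibullBetaRV as an example (this reimplements the sampling logic standalone with NumPy; the actual PyMC class wraps it in a `RandomVariable` subclass):

```python
import numpy as np

def weibull_beta_rng_fn(rng, alpha, beta, size=None):
    # Two-parameter (shape/scale) Weibull: NumPy's weibull only takes the
    # shape parameter alpha, so the scale beta is applied by multiplication.
    return beta * rng.weibull(alpha, size=size)

rng = np.random.default_rng(0)
draws = weibull_beta_rng_fn(rng, alpha=2.0, beta=3.0, size=500)
```

The duplication question is whether such a scale (or shift) parameter belongs in the PyTensor `RandomVariable` itself rather than in a near-copy of the class on the PyMC side.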
@yashvmanmode

Hi @ricardoV94,

I am Yash Manmode, a beginner in open-source contribution. Could you please give more details on this issue so that I can contribute to it?

Thank you

@jiisa-k

jiisa-k commented Jan 27, 2023

Hi @ricardoV94! If I understand correctly, these RVs have been implemented in PyMC and a similar or the same version has been implemented in PyTensor. Would you prefer moving those there and removing the implementation in PyMC or directly removing these implementations from PyMC?

@adithyalaks

adithyalaks commented Nov 5, 2024

Hi, I tried tackling this as part of PyData NYC '24. I think there's some uncertainty about what needs to be done here.

Taking the example of WaldRV in PyMC: we have an extra parameter alpha, and we also define an rng_fn that consumes alpha.

Would the solution then be to accept alpha as an argument in PyTensor's implementation and simply add it to the return value?

Or would it be to define an rng_fn in PyTensor, as is done in PyMC for WaldRV, and pass it as a kwarg (rng=) so that we do exactly as PyMC did?
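The first option amounts to something like the following sketch, again written standalone with NumPy rather than as a `RandomVariable` subclass (parameter names follow PyMC's Wald parametrization; treat the exact signature as an assumption):

```python
import numpy as np

def wald_rng_fn(rng, mu, lam, alpha, size=None):
    # NumPy's wald takes only mean mu and scale lam; PyMC's extra
    # parameter alpha acts as a location shift on the draws.
    return rng.wald(mu, lam, size=size) + alpha

rng = np.random.default_rng(123)
draws = wald_rng_fn(rng, mu=1.0, lam=2.0, alpha=3.0, size=1000)
```

Since Wald draws are strictly positive, every shifted draw ends up above alpha, which makes the role of the extra parameter easy to check.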
