
Normal Inverse Gaussian and Softplus NaN Gradient #1835

Open
i418c opened this issue Aug 19, 2024 · 0 comments
i418c commented Aug 19, 2024

I've been encountering NaNs introduced through the gradients while training my models. I think I've narrowed the cause down to the combination of the Normal Inverse Gaussian distribution and a subsequent Softplus bijector. I've tried reproducing the problem with the Normal distribution as well, but have been unable to. I haven't seen the issue with a bare distribution since #1778 was fixed, so I suspect this is something else.

A gist of the issue is here.
The model creation and fit calls are in a loop because, despite setting the seeds for TF and NumPy at the top, some other source of randomness remains and causes the failure to occur only sometimes.
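For anyone triaging: this does not reproduce the TFP bug itself, but a common way softplus-style transforms go non-finite is overflow in a naive formulation, after which inf/NaN propagates through every downstream gradient term. A minimal NumPy sketch (generic illustration only, not TFP's actual, stabilized implementation):

```python
import numpy as np

def naive_softplus(x):
    # Direct transcription of softplus(x) = log(1 + exp(x)).
    # exp(x) overflows to inf for large x, so the result is inf,
    # and any gradient computed through it becomes inf or NaN.
    return np.log1p(np.exp(x))

def stable_softplus(x):
    # Numerically stable rewrite: max(x, 0) + log1p(exp(-|x|)).
    # The exponent is always <= 0, so exp never overflows.
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

with np.errstate(over="ignore"):
    print(naive_softplus(1000.0))   # inf: exp(1000) overflows in float64
print(stable_softplus(1000.0))      # 1000.0: finite
```

If the NaN here originates inside the NIG log-prob (e.g. a sqrt or Bessel-function term hitting the edge of its domain after the bijector maps a parameter) rather than in Softplus itself, a similar domain-edge mechanism would apply; the gist should make it possible to bisect which op first emits a non-finite value.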

Any help would be appreciated.
