
Laplace approximation now uses Multivariate Normals #506

Merged: 6 commits into master on Mar 5, 2017

Conversation

@dustinvtran (Member) commented Mar 5, 2017

Previously, ed.Laplace ran MAP and then printed out the precision matrix, which was not useful. Now ed.Laplace properly takes multivariate normal random variables as input, optimizes their means via MAP, and finally sets their covariance matrices to the inverse of the observed Fisher information.
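For intuition, here is a minimal NumPy/SciPy sketch of the two steps described above (this is an illustration on a hypothetical toy posterior, not Edward's implementation): find the MAP estimate by minimizing the negative log joint, then set the Gaussian covariance to the inverse of the observed Fisher information, i.e. the Hessian of the negative log joint at the MAP. Edward computes the Hessian analytically via tf.hessians; the sketch uses finite differences.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_joint(z):
    # Hypothetical unnormalized posterior: a correlated 2-D Gaussian,
    # so the Laplace approximation is exact and easy to check.
    prec = np.array([[2.0, 0.5], [0.5, 1.0]])
    mu = np.array([1.0, -1.0])
    d = z - mu
    return 0.5 * d @ prec @ d

def numerical_hessian(f, x, eps=1e-5):
    # Central finite differences; Edward uses tf.hessians instead.
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i, e_j = np.zeros(n), np.zeros(n)
            e_i[i], e_j[j] = eps, eps
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * eps**2)
    return H

# Step 1: MAP estimate (Edward runs MAP inference for this step).
map_est = minimize(neg_log_joint, np.zeros(2)).x
# Step 2: covariance = inverse of the observed Fisher information.
cov = np.linalg.inv(numerical_hessian(neg_log_joint, map_est))
print(map_est)  # ≈ [1, -1], the mode of the toy posterior
print(cov)      # ≈ inverse of the precision matrix above
```

Because the toy posterior is exactly Gaussian, the recovered covariance matches the inverse precision matrix; for a non-Gaussian posterior the result is the usual Gaussian approximation around the mode.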

See test_laplace.py for the many ways you can instantiate ed.Laplace; they mirror the many subclasses of MultivariateNormal random variables.

Remarks

  • I removed the ed.hessian utility function in favor of TensorFlow's tf.hessians, which significantly speeds up Hessian computation.

@dustinvtran dustinvtran force-pushed the feature/laplace branch 2 times, most recently from b5851b3 to 1de77bb Compare March 5, 2017 13:09
@dustinvtran dustinvtran merged commit fae636e into master Mar 5, 2017
@dustinvtran dustinvtran deleted the feature/laplace branch March 5, 2017 22:08
Successfully merging this pull request may close these issues:

  • laplace approximation should use normal as approximating families