revise docs
dustinvtran committed Mar 5, 2017
1 parent fc3e53e commit 0e53717
Showing 3 changed files with 22 additions and 3 deletions.
7 changes: 7 additions & 0 deletions docs/tex/bib.bib
@@ -652,3 +652,10 @@ @article{marin2012approximate
number = {6},
pages = {1167--1180}
}

@article{fisher1925theory,
author = {Fisher, R. A.},
title = {{Theory of statistical estimation}},
journal = {Mathematical Proceedings of the Cambridge Philosophical Society},
year = {1925}
}
7 changes: 4 additions & 3 deletions docs/tex/tutorials/map-laplace.tex
@@ -22,8 +22,10 @@ \subsection{Laplace approximation}
\end{align*}
This requires computing a precision matrix $\Lambda$. Derived from a
Taylor expansion, the Laplace approximation uses the Hessian of the
negative log joint density at the MAP estimate. For flat priors
(equivalent to maximum likelihood), the precision matrix is known
as the observed Fisher information \citep{fisher1925theory}.
It is defined component-wise as
\begin{align*}
\Lambda_{ij}
&=
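The hunk cuts off before the right-hand side of this definition. For reference, a sketch of the standard component-wise expression, writing $\mathbf{z}^{\text{MAP}}$ for the MAP estimate (assumed notation; the file's own symbols may differ):

\begin{align*}
\Lambda_{ij}
&= -\frac{\partial^2}{\partial z_i \,\partial z_j}
\log p(\mathbf{x}, \mathbf{z})
\,\Big|_{\mathbf{z} = \mathbf{z}^{\text{MAP}}}.
\end{align*}

Under a flat prior the log joint reduces to the log-likelihood, so $\Lambda$ is exactly the observed Fisher information cited above.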
@@ -38,4 +40,3 @@ \subsection{Laplace approximation}
implementation in Edward's code base.

\subsubsection{References}\label{references}

11 changes: 11 additions & 0 deletions edward/inferences/laplace.py
@@ -44,6 +44,17 @@ def __init__(self, latent_vars, data=None, model_wrapper=None):
the diagonal. This does not capture correlation among the
variables, but it also does not require a potentially expensive
matrix inversion.

Examples
--------
>>> X = tf.placeholder(tf.float32, [N, D])
>>> w = Normal(mu=tf.zeros(D), sigma=tf.ones(D))
>>> y = Normal(mu=ed.dot(X, w), sigma=tf.ones(N))
>>>
>>> qw = MultivariateNormalFull(mu=tf.Variable(tf.random_normal([D])),
...                             sigma=tf.Variable(tf.random_normal([D, D])))
>>>
>>> inference = ed.Laplace({w: qw}, data={X: X_train, y: y_train})
"""
if isinstance(latent_vars, list):
with tf.variable_scope("posterior"):
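The docstring example stops at construction. A minimal end-to-end sketch of how it might be run, assuming Edward's 2017-era API (MultivariateNormalFull with mu/sigma arguments, ed.dot, Inference.run); the sizes N and D, the synthetic X_train and y_train, the identity initialization of sigma, and the choice of n_iter are illustrative, not from the commit:

import numpy as np
import tensorflow as tf
import edward as ed
from edward.models import MultivariateNormalFull, Normal

N, D = 100, 5  # illustrative sizes
X_train = np.random.randn(N, D).astype(np.float32)
y_train = np.random.randn(N).astype(np.float32)

# Bayesian linear regression, as in the docstring example.
X = tf.placeholder(tf.float32, [N, D])
w = Normal(mu=tf.zeros(D), sigma=tf.ones(D))
y = Normal(mu=ed.dot(X, w), sigma=tf.ones(N))

# Full-covariance Gaussian approximation to the posterior over w.
# Initializing sigma to the identity keeps it positive definite
# before inference assigns the fitted covariance.
qw = MultivariateNormalFull(mu=tf.Variable(tf.random_normal([D])),
                            sigma=tf.Variable(tf.eye(D)))

# Find the MAP estimate, then build the Laplace approximation around it.
inference = ed.Laplace({w: qw}, data={X: X_train, y: y_train})
inference.run(n_iter=250)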
