Releases: blei-lab/edward
1.0.7
- hotfix to get `from . import models` working
1.0.6
- website with revamped documentation: http://edwardlib.org. See details in #108
- criticism of probabilistic models with `ed.evaluate()` and `ed.ppc()`. See details in #107
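`ed.ppc()` carries out posterior predictive checks. As a library-free sketch of the underlying idea (the `ppc` helper and its arguments below are hypothetical illustrations, not Edward's API), one compares a test statistic on the observed data against its distribution under data replicated from the model:

```python
import numpy as np

def ppc(T, x_obs, sample_replicated, n_rep=500, seed=0):
    """Illustrative posterior predictive check (not Edward's API):
    compare T(x_obs) against T(x_rep) over replicated datasets x_rep
    drawn from the posterior predictive; return the fraction of
    replications with T(x_rep) >= T(x_obs) (an approximate p-value)."""
    rng = np.random.default_rng(seed)
    t_obs = T(x_obs)
    t_rep = np.array([T(sample_replicated(rng)) for _ in range(n_rep)])
    return float(np.mean(t_rep >= t_obs))

# Toy example: observed data from N(0, 1), and a (correct) model whose
# posterior predictive is also N(0, 1); the check should not flag misfit.
rng = np.random.default_rng(1)
x_obs = rng.normal(0.0, 1.0, size=100)
p = ppc(np.mean, x_obs, lambda r: r.normal(0.0, 1.0, size=100))
```

An extreme p-value (near 0 or 1) would indicate that the model fails to reproduce the chosen statistic of the data.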
1.0.5
- enabled Keras as neural network specification
- samples in variational model can now leverage TensorFlow-based samplers and not only SciPy-based samplers
- let user optionally specify `sess` when using inference
- mean-field variational inference can now take advantage of analytically tractable KL terms for standard normal priors
- data can additionally be a list of `np.ndarray`s or a list of `tf.placeholder`s
- added mixture density network as example
- enabled dimensions of distribution output to match input dimensions
- renamed `log_gamma`, `log_beta`, and `multivariate_log_beta` to `lgamma` and `lbeta` to follow convention in the TensorFlow API
- let `PointMass` be a variational factor
- fixed `Multinomial` variational factor
- added continuous integration for unit tests
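The analytically tractable KL term for a standard normal prior is the closed-form divergence KL(N(μ, σ²) ‖ N(0, 1)) = ½(μ² + σ² − log σ² − 1), which replaces a Monte Carlo estimate of that term in the variational objective. A NumPy sketch checking the closed form against a Monte Carlo estimate:

```python
import numpy as np

def kl_normal_stdnormal(mu, sigma):
    """Closed-form KL(N(mu, sigma^2) || N(0, 1))."""
    return 0.5 * (mu**2 + sigma**2 - np.log(sigma**2) - 1.0)

# Monte Carlo check: KL = E_q[log q(z) - log p(z)] with z ~ q = N(mu, sigma^2).
rng = np.random.default_rng(0)
mu, sigma = 0.5, 0.8
z = rng.normal(mu, sigma, size=200_000)
log_q = -0.5 * np.log(2 * np.pi * sigma**2) - (z - mu) ** 2 / (2 * sigma**2)
log_p = -0.5 * np.log(2 * np.pi) - z**2 / 2
mc = np.mean(log_q - log_p)  # should closely match the closed form
```

Using the closed form removes the sampling noise from this part of the gradient, which is why mean-field inference can exploit it when the prior is standard normal.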
1.0.4
- interface-wise, you now import models (probability models or variational models) using `from edward.models import PythonModel, Variational, Normal`. By default you can also do something like `ed.StanModel(model_file=model_file)`.
- variational distributions now default to initializing with only one factor
1.0.3
- generalized internals of variational distributions to use multivariate factors
- vectorized all distributions, with unit tests
- added additional distributions: `binom`, `chi2`, `geom`, `lognorm`, `nbinom`, `uniform`
- vectorized log density calls in variational distributions
- vectorized log density calls in model examples
1.0.2
- fixed bug in adding mean-field factorizations in `Variational()`
- support for PyMC3
1.0.1
- some distribution fixes
- mixture of Gaussians model example
- variational model interface
Initial release
Edward is a Python library for probabilistic modeling, inference, and criticism. It enables black box inference for models with discrete and continuous latent variables, neural network parameterizations, and infinite dimensional parameter spaces. Edward serves as a fusion of three fields: Bayesian statistics and machine learning, deep learning, and probabilistic programming.