Commit
Fix typo in nbviewer path
krasserm committed Sep 21, 2020
1 parent f698c5b commit 71b85ec
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions autoencoder-applications/variational_autoencoder_dfc.ipynb
@@ -16,7 +16,7 @@
  "\n",
  "### Plain VAE\n",
  "\n",
- "In a [previous article](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev//autoencoder-applications/variational_autoencoder.ipynb) I introduced the variational autoencoder (VAE) and how it can be trained with a variational lower bound $\mathcal{L}$ as optimization objective using stochastic gradient ascent methods. In context of stochastic gradient descent its negative value is used as loss function $L_{vae}$ which is a sum of a reconstruction loss $L_{rec}$ and a regularization term $L_{kl}$:\n",
+ "In a [previous article](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder.ipynb) I introduced the variational autoencoder (VAE) and how it can be trained with a variational lower bound $\mathcal{L}$ as optimization objective using stochastic gradient ascent methods. In context of stochastic gradient descent its negative value is used as loss function $L_{vae}$ which is a sum of a reconstruction loss $L_{rec}$ and a regularization term $L_{kl}$:\n",
  "\n",
  "$$\n",
  "\\begin{align*}\n",
@@ -771,4 +771,4 @@
  },
  "nbformat": 4,
  "nbformat_minor": 2
- }
+ }
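The changed paragraph describes the VAE training objective: the negative ELBO used as loss $L_{vae} = L_{rec} + L_{kl}$. As a minimal sketch of that decomposition (function and argument names here are illustrative, not from the notebook; it assumes a Bernoulli decoder, so binary cross-entropy reconstruction, and a diagonal-Gaussian encoder against a standard-normal prior):

```python
import numpy as np

def vae_loss(x, x_decoded, z_mean, z_log_var):
    """Negative ELBO: reconstruction loss L_rec plus KL regularizer L_kl.

    x, x_decoded:        inputs and decoder outputs in [0, 1], shape (batch, dim)
    z_mean, z_log_var:   parameters of the Gaussian posterior q(z|x)
    """
    eps = 1e-7
    x_decoded = np.clip(x_decoded, eps, 1 - eps)  # avoid log(0)
    # L_rec: binary cross-entropy, summed over data dimensions
    l_rec = -np.sum(x * np.log(x_decoded) + (1 - x) * np.log(1 - x_decoded), axis=-1)
    # L_kl: closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian
    l_kl = -0.5 * np.sum(1 + z_log_var - z_mean ** 2 - np.exp(z_log_var), axis=-1)
    return np.mean(l_rec + l_kl)
```

When the posterior equals the prior (zero mean, unit variance) the KL term vanishes and only the reconstruction loss remains, which is one way to sanity-check an implementation.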
