1 parent 4fd9a10 commit 31e4b36
labml_nn/optimizers/amsgrad.py
@@ -130,7 +130,7 @@ def _synthetic_experiment(is_adam: bool):
 
     We measure the performance of the optimizer as the regret,
     $$R(T) = \sum_{t=1}^T \big[ f_t(\theta_t) - f_t(\theta^*) \big]$$
-    where $theta_t$ is the parameters at time step $t$, and $\theta^*$ is the
+    where $\theta_t$ is the parameters at time step $t$, and $\theta^*$ is the
     optimal parameters that minimize $\mathbb{E}[f(\theta)]$.
 
     Now lets define the synthetic problem,
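For reference, a minimal sketch of how the regret defined above could be accumulated in code. The names `per_step_losses` and `optimal_losses` are hypothetical placeholders for the values of $f_t(\theta_t)$ and $f_t(\theta^*)$ collected over the run; this helper is not part of amsgrad.py.

# Minimal sketch (assumption, not from amsgrad.py): accumulate the regret
# R(T) = sum_{t=1}^T [ f_t(theta_t) - f_t(theta^*) ].
# `per_step_losses` holds f_t(theta_t); `optimal_losses` holds f_t(theta^*).

def regret(per_step_losses, optimal_losses):
    """Cumulative regret over all T steps."""
    return sum(l_t - l_opt for l_t, l_opt in zip(per_step_losses, optimal_losses))


# Example: a sequence of losses that converges towards the optimal loss.
print(regret([0.9, 0.5, 0.2, 0.1], [0.1, 0.1, 0.1, 0.1]))  # prints the total regret (approx. 1.3)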