@@ -393,7 +393,7 @@ Formula {eq}`eq:bayeslaw103` generalizes formula {eq}`eq:recur1`.
Formula {eq}`eq:bayeslaw103` can be regarded as a one-step revision of prior probability $ \pi_0 $ after seeing
the batch of data $ \left\{ w_{i}\right\} _{i=1}^{t+1} $.

-## What a type 1 Agent Learns when Mixture $H$ Generates Data
+## What a type 1 agent learns when mixture $H$ generates data

We now study what happens when the mixture distribution $h(\cdot;\alpha)$ truly generated the data each period.
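For readers who want to experiment with this batch revision of $\pi_0$ and with data generated by the mixture, here is a minimal sketch. It is not the lecture's own code: it assumes Beta densities for $f$ and $g$ and the standard likelihood-ratio recursion for $\pi_t$, whose iteration is equivalent to the one-step batch update from $\pi_0$.

```python
# Minimal illustrative sketch -- assumptions, not the lecture's code:
# f and g are hypothetical Beta densities, and π_t obeys the recursion
#   π_{t+1} = π_t * ell / (π_t * ell + 1 - π_t),  ell = f(w_{t+1}) / g(w_{t+1}).
import numpy as np
from scipy.stats import beta

f_dist, g_dist = beta(1, 1), beta(3, 1.2)   # hypothetical candidate densities
f, g = f_dist.pdf, g_dist.pdf

def simulate_π(α, π0=0.5, T=1000, seed=0):
    """Draw each w_t from the mixture h = α f + (1-α) g and update π_t."""
    rng = np.random.default_rng(seed)
    π = np.empty(T + 1)
    π[0] = π0
    for t in range(T):
        # with probability α the observation comes from f, otherwise from g
        w = (f_dist if rng.random() < α else g_dist).rvs(random_state=rng)
        ell = f(w) / g(w)                     # one-period likelihood ratio
        π[t + 1] = π[t] * ell / (π[t] * ell + 1 - π[t])
    return π

# with these hypothetical densities, π_t tends toward 0 and 1, respectively
print(simulate_π(α=0.2)[-1], simulate_π(α=0.8)[-1])
```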
@@ -469,7 +469,7 @@ plot_π_seq(α = 0.2)

Evidently, $\alpha$ has a big effect on the destination of $\pi_t$ as $t \rightarrow + \infty$.

-## Kullback-Leibler Divergence Governs Limit of $\pi_t$
+## Kullback-Leibler divergence governs limit of $\pi_t$

To understand what determines whether the limit point of $\pi_t$ is $0$ or $1$ and how the answer depends on the true value of the mixing probability $\alpha \in (0,1) $ that generates
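A minimal sketch of how the two divergences could be computed numerically by quadrature; the Beta parameterizations of $f$ and $g$ are again illustrative assumptions rather than values taken from the lecture.

```python
# Minimal sketch: approximate KL(h, f) and KL(h, g) for h = α f + (1-α) g,
# assuming hypothetical Beta densities f and g on (0, 1).
import numpy as np
from scipy.stats import beta
from scipy.integrate import quad

f = beta(1, 1).pdf          # hypothetical candidate density f
g = beta(3, 1.2).pdf        # hypothetical candidate density g

def KL(α, candidate):
    """KL divergence of the mixture h(.;α) from `candidate`, by quadrature."""
    h = lambda w: α * f(w) + (1 - α) * g(w)
    integrand = lambda w: h(w) * np.log(h(w) / candidate(w))
    return quad(integrand, 1e-6, 1 - 1e-6)[0]

for α in (0.2, 0.8):
    # the candidate with the smaller divergence from h is the one π_t favors
    print(f"α = {α}: KL_f = {KL(α, f):.4f}, KL_g = {KL(α, g):.4f}")
```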
@@ -613,7 +613,7 @@ Kullback-Leibler divergence:

- When $\alpha$ is large, $KL_f < KL_g$, meaning that the divergence of $f$ from $h$ is smaller than that of $g$ from $h$, and so the limit point of $\pi_t$ is close to $1$.

-## Type 2 Agent
+## Type 2 agent

We now describe how our type 2 agent formulates his learning problem and what he eventually learns.
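A minimal sketch of the type 2 agent's calculation: a grid approximation to the posterior over the mixing parameter $\alpha$ under a uniform prior. The Beta densities, the sample size, and the true value $\alpha = 0.8$ are assumptions chosen only to mirror the behaviour described below.

```python
# Minimal sketch of a type 2 agent: grid posterior over α under a uniform
# prior, with hypothetical Beta densities f and g and an assumed true α = 0.8.
import numpy as np
from scipy.stats import beta

f = beta(1, 1).pdf
g = beta(3, 1.2).pdf

def posterior_over_α(w_seq, grid=np.linspace(0.001, 0.999, 999)):
    """Grid approximation of the posterior over the mixing parameter α."""
    # log-likelihood of the whole sample under h(.;α) = α f + (1-α) g
    logL = np.array([np.log(a * f(w_seq) + (1 - a) * g(w_seq)).sum()
                     for a in grid])
    post = np.exp(logL - logL.max())          # rescale for numerical stability
    return grid, post / post.sum()

rng = np.random.default_rng(0)
T, α_true = 1000, 0.8
# draw each observation from f with probability α_true, otherwise from g
w_seq = np.where(rng.random(T) < α_true,
                 beta(1, 1).rvs(T, random_state=rng),
                 beta(3, 1.2).rvs(T, random_state=rng))
grid, post = posterior_over_α(w_seq)
print("posterior mean of α ≈", (grid * post).sum())
```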
@@ -697,7 +697,7 @@ plt.show()

Evidently, the Bayesian posterior narrows in on the true value $\alpha = .8$ of the mixing parameter as the length of the history of observations grows.

-## Concluding Remarks
+## Concluding remarks

Our type 1 person deploys an incorrect statistical model.