
Commit 7cba5eb

fix headings and subheadings to satisfy style guidelines
1 parent: fb4ddb0

File tree

1 file changed (+6, -6 lines)


lectures/mix_model.md

Lines changed: 6 additions & 6 deletions
````diff
@@ -206,7 +206,7 @@ l_arr_f = simulate(F_a, F_b, N=50000)
 l_seq_f = np.cumprod(l_arr_f, axis=1)
 ```
 
-## Sampling from Compound Lottery $H$
+## Sampling from compound lottery $H$
 
 We implement two methods to draw samples from
 our mixture model $\alpha F + (1-\alpha) G$.
````
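
The context lines above describe drawing samples from the compound lottery $H$. As a reading aid, here is a minimal sketch of the coin-flip method (presumably one of the lecture's two methods); the Beta parameters and the helper name `draw_from_h` are illustrative assumptions, not the lecture's actual code:

```python
import numpy as np

# Illustrative Beta parameters for F and G; the lecture's values may differ.
F_a, F_b = 1, 1
G_a, G_b = 3, 1.2

def draw_from_h(α, N, seed=0):
    """Draw N samples from H = α F + (1 - α) G: flip an α-coin per draw,
    sample from both components, and keep the draw the coin selects."""
    rng = np.random.default_rng(seed)
    coins = rng.uniform(size=N) < α          # True -> keep the F draw
    return np.where(coins,
                    rng.beta(F_a, F_b, size=N),
                    rng.beta(G_a, G_b, size=N))

samples = draw_from_h(α=0.8, N=50_000)
```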
````diff
@@ -290,7 +290,7 @@ plt.legend()
 plt.show()
 ```
 
-## Type 1 Agent
+## Type 1 agent
 
 We'll now study what our type 1 agent learns
 
````
````diff
@@ -393,7 +393,7 @@ Formula {eq}`eq:bayeslaw103` generalizes formula {eq}`eq:recur1`.
 Formula {eq}`eq:bayeslaw103` can be regarded as a one step revision of prior probability $ \pi_0 $ after seeing
 the batch of data $ \left\{ w_{i}\right\} _{i=1}^{t+1} $.
 
-## What a type 1 Agent Learns when Mixture $H$ Generates Data
+## What a type 1 agent learns when mixture $H$ generates data
 
 We now study what happens when the mixture distribution $h;\alpha$ truly generated the data each period.
 
````
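
The context lines here describe {eq}`eq:bayeslaw103` as a one-step batch revision of the prior $\pi_0$. A hedged sketch of that computation, assuming the standard likelihood-ratio form $\pi = \pi_0 L / (\pi_0 L + 1 - \pi_0)$ with $L = \prod_i f(w_i)/g(w_i)$ and the Beta densities used elsewhere in the lecture (parameter values are illustrative):

```python
import numpy as np
from scipy.stats import beta

# Illustrative Beta parameters; the lecture defines its own f and g.
F_a, F_b, G_a, G_b = 1, 1, 3, 1.2

def batch_posterior(w, π_0=0.5):
    """Revise the prior π_0 in one step after seeing the whole batch {w_i}.
    Accumulates the likelihood ratio in logs so long samples stay stable."""
    log_L = np.sum(np.log(beta.pdf(w, F_a, F_b) / beta.pdf(w, G_a, G_b)))
    # Logistic rearrangement of π_0 L / (π_0 L + 1 - π_0)
    return 1.0 / (1.0 + np.exp(-log_L) * (1 - π_0) / π_0)
```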
````diff
@@ -469,7 +469,7 @@ plot_π_seq(α = 0.2)
 
 Evidently, $\alpha$ is having a big effect on the destination of $\pi_t$ as $t \rightarrow + \infty$
 
-## Kullback-Leibler Divergence Governs Limit of $\pi_t$
+## Kullback-Leibler divergence governs limit of $\pi_t$
 
 To understand what determines whether the limit point of $\pi_t$ is $0$ or $1$ and how the answer depends on the true value of the mixing probability $\alpha \in (0,1) $ that generates
 
````
````diff
@@ -613,7 +613,7 @@ Kullback-Leibler divergence:
 
 - When $\alpha$ is large, $KL_f < KL_g$ meaning the divergence of $f$ from $h$ is smaller than that of $g$ and so the limit point of $\pi_t$ is close to $1$.
 
-## Type 2 Agent
+## Type 2 agent
 
 We now describe how our type 2 agent formulates his learning problem and what he eventually learns.
````
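
The bullet in this hunk turns on comparing $KL_f$ and $KL_g$. A sketch of how the two divergences might be computed numerically; the densities, $\alpha$, and integration bounds are illustrative assumptions:

```python
import numpy as np
from scipy.stats import beta
from scipy.integrate import quad

# Illustrative parameters; the lecture's f, g, and α may differ.
F_a, F_b, G_a, G_b = 1, 1, 3, 1.2
α = 0.8

f = lambda w: beta.pdf(w, F_a, F_b)
g = lambda w: beta.pdf(w, G_a, G_b)
h = lambda w: α * f(w) + (1 - α) * g(w)

def KL(p, q):
    """Kullback-Leibler divergence KL(p || q) = ∫ p(w) log(p(w)/q(w)) dw."""
    val, _ = quad(lambda w: p(w) * np.log(p(w) / q(w)), 1e-6, 1 - 1e-6)
    return val

KL_f, KL_g = KL(h, f), KL(h, g)   # KL_f < KL_g should push π_t toward 1
```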

````diff
@@ -697,7 +697,7 @@ plt.show()
 
 Evidently, the Bayesian posterior narrows in on the true value $\alpha = .8$ of the mixing parameter as the length of a history of observations grows.
 
-## Concluding Remarks
+## Concluding remarks
 
 Our type 1 person deploys an incorrect statistical model.
````
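
To visualize the narrowing posterior described in the context line, a grid-based sketch of the type 2 agent's posterior over $\alpha$ under a flat prior (every parameter here is an illustrative assumption, not the lecture's exact setup):

```python
import numpy as np
from scipy.stats import beta

# Illustrative setup; the lecture's prior, densities, and data differ.
F_a, F_b, G_a, G_b = 1, 1, 3, 1.2
α_true, T = 0.8, 1_000
rng = np.random.default_rng(0)

# Simulate a history {w_t} from the mixture H with true α = 0.8
coins = rng.uniform(size=T) < α_true
w = np.where(coins, rng.beta(F_a, F_b, size=T), rng.beta(G_a, G_b, size=T))

# Log-likelihood of each candidate α on a grid, then normalize;
# logs keep the product of T densities numerically stable.
α_grid = np.linspace(0.001, 0.999, 999)
log_lik = np.log(α_grid[:, None] * beta.pdf(w, F_a, F_b)
                 + (1 - α_grid[:, None]) * beta.pdf(w, G_a, G_b)).sum(axis=1)
posterior = np.exp(log_lik - log_lik.max())
posterior /= posterior.sum()        # mass should concentrate near α = 0.8
```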
