Commit 1f822ab

Merge pull request CamDavidsonPilon#342 from CPaleske/master
Update Ch1_Introduction_PyMC2.ipynb
2 parents 6f1249d + 0bd5a91

File tree

1 file changed (+1, -1)


Chapter1_Introduction/Ch1_Introduction_PyMC2.ipynb

Lines changed: 1 addition & 1 deletion
@@ -399,7 +399,7 @@
 "- **$Z$ is mixed**: Mixed random variables assign probabilities to both discrete and continuous random variables, i.e. it is a combination of the above two categories. \n",
 "\n",
 "#### Expected Value\n",
-"Expected value (EV) is one of the most important concepts in probability. The EV for a given probability distribution can be described as \"the mean value in the long run for many repeated samples from that distribution.\" To borrow a metaphor from physics, a distribution's EV as like its \"center of mass.\" Imagine repeating the same experiment many times over, and taking the average over each outcome. The more you repeat the experiment, the closer this average will become to the distributions EV. (side note: as the number of repeated experiments goes to infinity, the difference between the average outcome and the EV becomes arbitrarily small.)\n",
+"Expected value (EV) is one of the most important concepts in probability. The EV for a given probability distribution can be described as \"the mean value in the long run for many repeated samples from that distribution.\" To borrow a metaphor from physics, a distribution's EV acts like its \"center of mass.\" Imagine repeating the same experiment many times over, and taking the average over each outcome. The more you repeat the experiment, the closer this average will become to the distributions EV. (side note: as the number of repeated experiments goes to infinity, the difference between the average outcome and the EV becomes arbitrarily small.)\n",
 "\n",
 "### Discrete Case\n",
 "If $Z$ is discrete, then its distribution is called a *probability mass function*, which measures the probability $Z$ takes on the value $k$, denoted $P(Z=k)$. Note that the probability mass function completely describes the random variable $Z$, that is, if we know the mass function, we know how $Z$ should behave. There are popular probability mass functions that consistently appear: we will introduce them as needed, but let's introduce the first very useful probability mass function. We say $Z$ is *Poisson*-distributed if:\n",
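The edited cell makes two claims that are easy to check numerically: the sample mean of repeated draws converges to the EV, and a Poisson mass function fully describes the variable's behavior. A minimal sketch (not part of the commit; NumPy assumed, and `lam = 4.5` is an arbitrary illustrative rate):

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(0)   # seed chosen arbitrarily for reproducibility
lam = 4.5                        # illustrative Poisson rate; EV of Poisson(lam) is lam

# "Center of mass" / long-run-mean claim: the sample mean of many
# repeated draws approaches the EV as the number of draws grows.
samples = rng.poisson(lam, size=100_000)
print(samples[:100].mean(), samples.mean())  # the second mean sits closer to lam

# The Poisson probability mass function P(Z = k) = lam**k * e**(-lam) / k!
# should match the empirical frequency of each outcome k.
def poisson_pmf(k: int, lam: float) -> float:
    return lam**k * exp(-lam) / factorial(k)

print((samples == 4).mean(), poisson_pmf(4, lam))  # both near 0.19 for lam = 4.5
```

With 100,000 draws the gap between the sample mean and `lam` is on the order of `sqrt(lam / n) ≈ 0.007`, which is the "arbitrarily small" difference the side note in the cell refers to.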

0 commit comments