Update Ch1_Introduction_PyMC2.ipynb #342

Merged
merged 1 commit on Apr 17, 2017
2 changes: 1 addition & 1 deletion Chapter1_Introduction/Ch1_Introduction_PyMC2.ipynb
@@ -399,7 +399,7 @@
"- **$Z$ is mixed**: Mixed random variables assign probabilities to both discrete and continuous random variables, i.e. it is a combination of the above two categories. \n",
"\n",
"#### Expected Value\n",
"Expected value (EV) is one of the most important concepts in probability. The EV for a given probability distribution can be described as \"the mean value in the long run for many repeated samples from that distribution.\" To borrow a metaphor from physics, a distribution's EV as like its \"center of mass.\" Imagine repeating the same experiment many times over, and taking the average over each outcome. The more you repeat the experiment, the closer this average will become to the distributions EV. (side note: as the number of repeated experiments goes to infinity, the difference between the average outcome and the EV becomes arbitrarily small.)\n",
"Expected value (EV) is one of the most important concepts in probability. The EV for a given probability distribution can be described as \"the mean value in the long run for many repeated samples from that distribution.\" To borrow a metaphor from physics, a distribution's EV acts like its \"center of mass.\" Imagine repeating the same experiment many times over, and taking the average over each outcome. The more you repeat the experiment, the closer this average will become to the distributions EV. (side note: as the number of repeated experiments goes to infinity, the difference between the average outcome and the EV becomes arbitrarily small.)\n",
"\n",
"### Discrete Case\n",
"If $Z$ is discrete, then its distribution is called a *probability mass function*, which measures the probability $Z$ takes on the value $k$, denoted $P(Z=k)$. Note that the probability mass function completely describes the random variable $Z$, that is, if we know the mass function, we know how $Z$ should behave. There are popular probability mass functions that consistently appear: we will introduce them as needed, but let's introduce the first very useful probability mass function. We say $Z$ is *Poisson*-distributed if:\n",
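The edited paragraph claims that the running average of repeated outcomes converges to a distribution's EV, and the surrounding context introduces the Poisson mass function. A minimal sketch of both ideas, assuming NumPy (which the chapter already uses); the rate `lam = 4.5`, the seed, and the sample size are illustrative choices, not values taken from the notebook:

```python
import numpy as np
from math import exp, factorial

np.random.seed(42)   # fixed seed so the run is reproducible
lam = 4.5            # illustrative Poisson rate, not from the PR

# Repeat the "experiment" many times by sampling Poisson(lam).
samples = np.random.poisson(lam, size=100000)

# Running average of the first n outcomes. The EV paragraph says this
# should settle near the distribution's EV, which is lam for a Poisson.
running_mean = np.cumsum(samples) / np.arange(1, samples.size + 1)
for n in (10, 100, 10000, 100000):
    print("mean of first %6d samples: %.4f" % (n, running_mean[n - 1]))

# The mass function the hunk's last line introduces, P(Z = k), equals
# lam**k * exp(-lam) / k! for a Poisson; compare it to the observed frequency.
k = 3
print("empirical   P(Z=%d): %.4f" % (k, np.mean(samples == k)))
print("theoretical P(Z=%d): %.4f" % (k, lam**k * exp(-lam) / factorial(k)))
```

The gap between the two printed probabilities, like the drift of the running means toward 4.5, shrinks as the sample size grows, which is exactly the "long run" behavior the corrected sentence describes.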