Bayesian Statistics Made Simple
Code and exercises from my workshop on Bayesian statistics in Python.
Copyright 2016 Allen Downey
MIT License: https://opensource.org/licenses/MIT
The World Cup Problem
We'll use λ to represent the hypothetical goal-scoring rate in goals per game.
To compute prior probabilities for values of λ, I'll use a Gamma distribution.
The mean is 1.3, which is the average number of goals per team per game in World Cup play.
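Here's one way such a prior might be constructed, using `scipy.stats.gamma` and the `Pmf` class from the workshop's `thinkbayes2` module (the grid of λ values is an assumption, chosen to cover the plausible range):

```python
import numpy as np
from scipy.stats import gamma
from thinkbayes2 import Pmf

# Discretize a Gamma prior with mean 1.3 over a range of goal-scoring rates
lams = np.linspace(0, 12, 101)          # hypothetical rates, goals per game
prior = Pmf(dict(zip(lams, gamma.pdf(lams, 1.3))))
prior.Normalize()
```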
Exercise: Write a class called `Soccer` that extends `Suite` and defines `Likelihood`, which should compute the probability of the data (the time between goals in minutes) for a hypothetical goal-scoring rate, `lam`, in goals per game.
Hint: For a given value of `lam`, the time between goals is distributed exponentially.
Here's an outline to get you started:
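The original outline isn't reproduced here, but a sketch along these lines (assuming `Suite` from `thinkbayes2`) captures the structure; filling in `Likelihood` is the exercise:

```python
from thinkbayes2 import Suite

class Soccer(Suite):
    """Represents hypotheses about the goal-scoring rate."""

    def Likelihood(self, data, hypo):
        """Computes the likelihood of the data under the hypothesis.

        data: time between goals, in minutes
        hypo: lam, the hypothetical goal-scoring rate in goals per game
        """
        # TODO: convert minutes to games and evaluate the exponential pdf
        return 1
```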
Now we can create a `Soccer` object and initialize it with the prior Pmf:
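For example (assuming `prior` is the Gamma prior Pmf sketched above, and that `Likelihood` has been filled in):

```python
import thinkplot

soccer = Soccer(prior)      # the Suite starts out with the prior probabilities
thinkplot.Pdf(soccer)       # plot the prior distribution of lam
```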
Here's the update after the first goal at 11 minutes.
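Something like this, with the data given as elapsed time in minutes:

```python
soccer.Update(11)           # first goal after 11 minutes
thinkplot.Pdf(soccer)
```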
Here's the update after the second goal at 23 minutes (the time between first and second goals is 12 minutes).
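And similarly for the second goal (a sketch):

```python
soccer.Update(12)           # 12 minutes between the first and second goals
thinkplot.Pdf(soccer)
```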
This distribution represents our belief about `lam` after two goals.
Estimating the predictive distribution
Now we want to predict the number of goals in the remaining 67 minutes. There are two sources of uncertainty:

1. We don't know the true value of λ.
2. Even if we did, we wouldn't know how many goals would be scored.

We can quantify both sources of uncertainty at the same time, like this:

1. Choose a random value from the posterior distribution of λ.
2. Use the chosen value to generate a random number of goals.

If we run these steps many times, we can estimate the distribution of goals scored.
We can sample a value from the posterior like this:
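For example, using `Pmf.Sample` (a sketch; `soccer` is the posterior Suite from above):

```python
lam = soccer.Sample(1)[0]   # one random draw from the posterior of lam
```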
Given `lam`, the number of goals scored in the remaining 67 minutes comes from the Poisson distribution with parameter `lam * t`, with `t` in units of games.
So we can generate a random value like this:
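For example (67 minutes out of a 90-minute game is about 0.74 games):

```python
t = 67 / 90                     # remaining time, in games
np.random.poisson(lam * t)      # one simulated number of additional goals
```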
If we generate a large sample, we can see the shape of the distribution:
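A sketch, still using the single sampled `lam` (the sample size of 10000 is an arbitrary choice):

```python
sample = np.random.poisson(lam * t, size=10000)
pmf = Pmf(sample)
pmf.Normalize()
thinkplot.Hist(pmf)             # shape of the simulated distribution of goals
```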
But that's based on a single value of `lam`, so it doesn't take into account both sources of uncertainty. Instead, we should sample values from the posterior distribution and generate one prediction for each.
Exercise: Write a few lines of code to

1. Use `Pmf.Sample` to generate a sample with `n=10000` from the posterior distribution `soccer`.
2. Use `np.random.poisson` to generate a random number of goals from the Poisson distribution with parameter `lam * t`, where `t` is the remaining time in the game (in units of games).
3. Plot the distribution of the predicted number of goals, and print its mean.
What is the probability of scoring 5 or more goals in the remainder of the game?
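If you want a starting point, here's one possible way to structure these steps (a sketch, not the only solution):

```python
# 1. Sample n=10000 values of lam from the posterior
lam_sample = soccer.Sample(10000)

# 2. For each sampled lam, draw a number of goals in the remaining time
t = 67 / 90
goals = np.random.poisson(lam_sample * t)

# 3. Plot the predictive distribution and print its mean
pred = Pmf(goals)
pred.Normalize()
thinkplot.Hist(pred)
print(goals.mean())

# Probability of scoring 5 or more goals in the remainder of the game
print((goals >= 5).mean())
```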
Computing the predictive distribution
Alternatively, we can compute the predictive distribution by making a mixture of Poisson distributions.
`MakePoissonPmf` makes a Pmf that represents a Poisson distribution.
If we assume that `lam` is the mean of the posterior, we can generate a predictive distribution for the number of goals in the remainder of the game.
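For instance, using `MakePoissonPmf` with the posterior mean (a sketch; the upper bound of 12 goals is an arbitrary cutoff):

```python
from thinkbayes2 import MakePoissonPmf

lam = soccer.Mean()                 # posterior mean goal-scoring rate
t = 67 / 90                         # remaining time, in games
pred = MakePoissonPmf(lam * t, high=12)
print(pred.Mean())                  # about 2 goals
print(pred.ProbGreater(4))          # probability of 5 or more goals
```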
The predictive mean is about 2 goals.
And the chance of scoring 5 or more goals is still small.
But that answer is only approximate because it does not take into account our uncertainty about `lam`. The correct method is to compute a weighted mixture of Poisson distributions, one for each possible value of `lam`.
The following figure shows the predictive distributions for different values of `lam`.
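A sketch of how such a figure might be made; the particular values of `lam` shown here are arbitrary:

```python
for lam in [0.5, 1.5, 3, 6]:        # a few illustrative goal-scoring rates
    pred = MakePoissonPmf(lam * t, high=12)
    thinkplot.Pdf(pred, label='lam = %g' % lam)
thinkplot.Config(xlabel='Goals scored', ylabel='PMF')
```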
We can compute the mixture of these distributions by making a Meta-Pmf that maps from each Poisson Pmf to its probability.
`MakeMixture` takes a Meta-Pmf (a Pmf that contains Pmfs) and returns a single Pmf that represents the weighted mixture of distributions:
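The version in `thinkbayes2` works roughly like this sketch: each outcome of each inner Pmf contributes to the mixture, weighted by the probability of that Pmf.

```python
def MakeMixture(metapmf, label='mix'):
    """Makes a mixture distribution.

    metapmf: Pmf that maps from Pmfs to probabilities
    """
    mix = Pmf(label=label)
    for pmf, p1 in metapmf.Items():
        for x, p2 in pmf.Items():
            mix.Incr(x, p1 * p2)
    return mix
```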
Here's the result for the World Cup problem.
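A sketch of the computation (again with an arbitrary upper bound of 12 goals):

```python
metapmf = Pmf()
for lam, prob in soccer.Items():
    pred = MakePoissonPmf(lam * t, high=12)
    metapmf.Set(pred, prob)         # map each Poisson Pmf to its probability

mix = MakeMixture(metapmf)
```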
And here's what the mixture looks like.
Exercise: Compute the predictive mean and the probability of scoring 5 or more additional goals.
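One possible approach (a sketch):

```python
print(mix.Mean())                   # predictive mean
print(mix.ProbGreater(4))           # probability of 5 or more additional goals
```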