# Seg39. MCMC and Gibbs Sampling

## Contents

## Skilled problem

### problem 1

Suppose the domain of a model is the five integers <math>x \in \{1,2,3,4,5\}</math>, and that your proposal distribution is: "When <math>x_1 = 2,3,4</math>, choose <math>x_2 = x_1 \pm 1</math> with equal probability. For <math>x_1=1</math>, always choose <math>x_2 =2</math>. For <math>x_1=5</math>, always choose <math>x_2 =4</math>." What is the acceptance probability <math>\alpha(x_1,x_2)</math> for all possible values of <math>x_1</math> and <math>x_2</math>?

**A**: The question is slightly underspecified: to compute the acceptance probability we need both the proposal distribution and the target (posterior) probability of each state, but only the proposal distribution is given.

Assume all states are equally probable, i.e. <math> \pi(x=i) = \frac15</math> for <math>i = 1,2,3,4,5</math>.

The proposal matrix <math>q(x_2|x_1)</math> is:

x1\x2 | 1 | 2 | 3 | 4 | 5 |
---|---|---|---|---|---|
1 | 0 | 1 | 0 | 0 | 0 |
2 | .5 | 0 | .5 | 0 | 0 |
3 | 0 | .5 | 0 | .5 | 0 |
4 | 0 | 0 | .5 | 0 | .5 |
5 | 0 | 0 | 0 | 1 | 0 |

With a uniform target, the Metropolis-Hastings acceptance probability reduces to <math>\alpha(x_1,x_2) = \min\left(1, \frac{\pi(x_2)\,q(x_1|x_2)}{\pi(x_1)\,q(x_2|x_1)}\right) = \min\left(1, \frac{q(x_1|x_2)}{q(x_2|x_1)}\right)</math>, giving:

x1\x2 | 1 | 2 | 3 | 4 | 5 |
---|---|---|---|---|---|
1 | 0 | .5 | 0 | 0 | 0 |
2 | 1 | 0 | 1 | 0 | 0 |
3 | 0 | 1 | 0 | 1 | 0 |
4 | 0 | 0 | 1 | 0 | 1 |
5 | 0 | 0 | 0 | .5 | 0 |

Entries for moves that are never proposed (<math>q(x_2|x_1) = 0</math>) are shown as 0. Note that no acceptance probability can exceed 1; the <math>\min</math> caps the ratio.
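As a numerical check, the acceptance matrix can be computed directly from the proposal matrix. A minimal sketch, assuming the uniform target <math>\pi(x=i)=\tfrac15</math> and the standard Metropolis-Hastings rule:

```python
import numpy as np

# Proposal matrix q[i, j] = q(x2 = j+1 | x1 = i+1), from the table above.
q = np.array([
    [0.0, 1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0, 0.0],
])
pi = np.full(5, 0.2)  # uniform target, pi(x = i) = 1/5

# Metropolis-Hastings: alpha(x1, x2) = min(1, pi(x2) q(x1|x2) / (pi(x1) q(x2|x1)))
alpha = np.zeros_like(q)
for i in range(5):
    for j in range(5):
        if q[i, j] > 0:
            alpha[i, j] = min(1.0, pi[j] * q[j, i] / (pi[i] * q[i, j]))

print(alpha)
```

Because the target is uniform, <math>\pi</math> cancels and only the proposal asymmetry at the two boundary states matters.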

### problem 2

Suppose the domain of a model is <math>-\infty < x < \infty</math> and your proposal distribution is (perversely),

<math>q(x_2|x_1) = \begin{cases}\tfrac{7}{2}\exp[-7(x_2-x_1)],\quad & x_2 \ge x_1 \\ \tfrac{5}{2}\exp[-5(x_1-x_2)],\quad & x_2 < x_1 \end{cases}</math>

Sketch this distribution as a function of <math>x_2-x_1</math>. Then, write down an expression for the acceptance probability <math>\alpha(x_1,x_2)</math>.

**A**: Let <math> a = x_2 - x_1 </math>; then

<math>p(a) = \begin{cases}\tfrac{7}{2}\exp(-7a),\quad & a \ge 0 \\ \tfrac{5}{2}\exp(5a),\quad & a < 0 \end{cases}</math>

Here's the code:

```python
import numpy as np
import matplotlib.pyplot as plt

xx = np.arange(-2, 2, step=0.0001)

def prob(x):
    # p(a) as a function of a = x2 - x1
    if x >= 0:
        return 3.5 * np.exp(-7 * x)
    return 2.5 * np.exp(5 * x)

yy = [prob(x) for x in xx]
plt.plot(xx, yy)
plt.show()
```

Assume <math> \pi(x_1) = \pi(x_2)</math>, so that <math>\alpha(x_1,x_2) = \min\left(1, \frac{q(x_1|x_2)}{q(x_2|x_1)}\right)</math>. Then

<math>\alpha(x_1,x_2) = \begin{cases}\min\left(\tfrac{5}{7}\exp[2(x_2-x_1)],\,1\right),\quad & x_2 > x_1 \\ 1,\quad & x_2 = x_1 \\ \min\left(\tfrac{7}{5}\exp[2(x_2-x_1)],\,1\right),\quad & x_2 < x_1 \end{cases}</math>

Both nontrivial branches have the form <math>\min\left(c\exp[2(x_2-x_1)],\,1\right)</math>, with <math>c=5/7</math> for upward moves and <math>c=7/5</math> for downward moves. The proposal over-represents small upward steps and large downward steps, so exactly those moves are accepted with probability less than 1; the crossover is at <math>|x_2-x_1| = \tfrac12\ln\tfrac75</math>.
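As a quick sanity check (a sketch, with the constant <math>\pi</math> taken to cancel), we can implement <math>q</math> and <math>\alpha</math> and verify detailed balance, <math>q(x_2|x_1)\,\alpha(x_1,x_2) = q(x_1|x_2)\,\alpha(x_2,x_1)</math>, at a few points:

```python
import math

def q(x2, x1):
    """Proposal density q(x2 | x1) from the problem statement."""
    if x2 >= x1:
        return 3.5 * math.exp(-7 * (x2 - x1))
    return 2.5 * math.exp(-5 * (x1 - x2))

def alpha(x1, x2):
    """MH acceptance probability, assuming pi(x1) = pi(x2)."""
    return min(1.0, q(x1, x2) / q(x2, x1))

# Detailed balance with a flat target: q(x2|x1) a(x1,x2) == q(x1|x2) a(x2,x1)
for x1, x2 in [(0.0, 0.3), (0.0, -0.3), (1.0, 1.05), (2.0, 1.0)]:
    lhs = q(x2, x1) * alpha(x1, x2)
    rhs = q(x1, x2) * alpha(x2, x1)
    assert abs(lhs - rhs) < 1e-12
```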

## Thought problem

See today's class activity.

### problem 1

Suppose an urn contains 7 large orange balls, 3 medium purple balls, and 5 small green balls. When balls are drawn randomly, the larger ones are more likely to be drawn, in the proportions large:medium:small = 6:4:3. You want to draw exactly 6 balls, one at a time without replacement. How would you use Gibbs sampling to learn: (a) How often do you get 5 orange plus 2 of the same (non-orange) color? (b) What is the expectation (mean) of the product of the number of purple and number of green balls drawn?

### problem 2

How would you do the same problem computationally but without Gibbs sampling?
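One possible approach, sketched here as a hedged illustration rather than a worked class solution: simulate the weighted draws directly and tally the joint color counts. The ball counts, colors, and 6:4:3 weights come from the problem statement; all function and variable names are this sketch's own.

```python
import random
from collections import Counter

# Ball counts and draw weights: large orange : medium purple : small green = 6:4:3.
COUNTS = {"orange": 7, "purple": 3, "green": 5}
WEIGHTS = {"orange": 6, "purple": 4, "green": 3}

def draw_six(rng):
    """Draw 6 balls one at a time, without replacement, weighted by size."""
    urn = dict(COUNTS)
    drawn = Counter()
    for _ in range(6):
        colors = [c for c in urn if urn[c] > 0]
        w = [urn[c] * WEIGHTS[c] for c in colors]
        c = rng.choices(colors, weights=w)[0]
        urn[c] -= 1
        drawn[c] += 1
    return drawn

rng = random.Random(0)
n = 50_000
tally = Counter()
prod_sum = 0
for _ in range(n):
    d = draw_six(rng)
    tally[(d["orange"], d["purple"], d["green"])] += 1
    prod_sum += d["purple"] * d["green"]   # for part (b)

e_product = prod_sum / n
print(e_product)
```

The probability of any specific outcome can be read off `tally`, and part (b) is the running average of the purple-green product. (Note that 5 orange plus 2 of one other color is 7 balls, so as literally stated the part (a) count cannot occur in a single draw of 6.)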

### problem 3

How would you do the same problem analytically? (Hint: This is known as the Wallenius non-central hypergeometric distribution.)

## Class activity

Worked on the analytical team with Kai, Travis, LoriL, and Tameem; for the code, see Kai.