# Seg41. Markov Chain Monte Carlo, Example 2

## Skill problems

### problem 1

Show that the waiting times (times between events) in a Poisson process are Exponentially distributed. (I think we've done this before.)

**A** This question is essentially the same as segment 10, class activity problem 2.

Let <math> N_t </math> be the number of events that have occurred by time <math>t</math>, and <math> T_n </math> be the time at which the <math>n</math>th event occurs. Then

<math> p(N_t \ge n) = p(T_n \le t)</math>

Also from lecture we know the density of <math>T_n</math> is <math>p(T_n=t) = \frac{\lambda^n}{(n-1)!}t^{n-1}e^{-\lambda t}</math>, and <math>p(T_n \le t) = \int_0^t p(T_n =x)dx </math>

Then

<math> p(N_t = n) = p(N_t \ge n) - p(N_t \ge n+1) = p(T_n \le t) - p(T_{n+1} \le t) = \frac{(\lambda t)^ne^{-\lambda t}}{n!} \ldots \ldots (*) </math>

The waiting time until the first event exceeds <math>t</math> exactly when no events have happened by time <math>t</math>. Plugging <math>n = 0</math> into (*), we get

<math> p(N_t = 0) = e^{-\lambda t}</math>

So if <math>T</math> is the waiting time, <math> p(T > t) = p(N_t = 0) = e^{-\lambda t}</math>, and its cdf is <math>1 - e^{-\lambda t}</math>. Differentiating gives the pdf,

<math> p(t) = \frac{d}{dt}\left(1 - e^{-\lambda t}\right) = \lambda e^{-\lambda t}</math>, which is the Exponential distribution with rate <math>\lambda</math>.
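As a numerical sanity check, the derivation above can be tested by simulation. A Poisson process on a window <math>[0, T]</math> can be generated as a Poisson-distributed number of events placed uniformly at random; the inter-arrival times should then match Exponential(<math>\lambda</math>), which has mean and standard deviation <math>1/\lambda</math>. The rate and window below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, T = 3.0, 10_000.0                    # rate and window (illustrative choices)

# A Poisson process on [0, T]: a Poisson-distributed number of events,
# each placed uniformly at random in the window.
n = rng.poisson(lam * T)
times = np.sort(rng.uniform(0.0, T, size=n))
waits = np.diff(times)

# For Exponential(lam): mean = std = 1/lam, and P(wait > t) = exp(-lam*t).
print(waits.mean(), waits.std())          # both near 1/lam
print(np.mean(waits > 1.0 / lam))         # near exp(-1)
```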

### problem 2

Plot the pdf's of the waiting times between (a) every other Poisson event, and (b) every Poisson event at half the rate.

**A**

For (a), the waiting time between every other event is the time until the second event, which exceeds <math>t</math> exactly when at most one event has occurred by time <math>t</math>. Using (*),

<math> p(T > t) = p(N_t = 0) + p(N_t = 1) = (1 + \lambda t)e^{-\lambda t} </math>

so the pdf is <math> p_a(t) = -\frac{d}{dt}\left[(1 + \lambda t)e^{-\lambda t}\right] = \lambda^2 t e^{-\lambda t} </math>

For (b), the process has rate <math>\lambda/2</math>, so by problem 1 the waiting time between events is

<math> p_b(t) = \frac{\lambda}2 e^{-\lambda t/2} </math>

So the plot is:

```python
import numpy as np
import matplotlib.pyplot as plt

def everyother(r, x): return r**2 * x * np.exp(-r * x)
def everyhalf(r, x): return r / 2 * np.exp(-r * x / 2)

xx = np.arange(0, 10, 0.1)
plt.figure(figsize=(6, 4))
plt.plot(xx, everyother(1., xx), label='every other event')
plt.plot(xx, everyhalf(1., xx), label='half rate')
plt.xlabel('time')
plt.ylabel('prob')
plt.legend()
```
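A quick simulation can confirm the qualitative difference between the two curves: both waiting-time distributions have the same mean <math>2/\lambda</math>, but case (a) has variance <math>2/\lambda^2</math> while case (b) has <math>4/\lambda^2</math>, which is why the shapes differ. The rate and sample size here are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n = 1.0, 200_000            # rate and sample size (illustrative choices)

# (a) every other event of a rate-lam process: each gap is the sum of
#     two Exponential(lam) inter-arrival times.
every_other = rng.exponential(1 / lam, size=(n, 2)).sum(axis=1)

# (b) every event of a rate-lam/2 process: a single wait with mean 2/lam.
half_rate = rng.exponential(2 / lam, size=n)

# Same mean (2/lam), different variances (2/lam**2 vs 4/lam**2).
print(every_other.mean(), half_rate.mean())
print(every_other.var(), half_rate.var())
```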

### problem 3

Show, using characteristic functions, that the waiting times between every Nth event in a Poisson process is Gamma distributed. (I think we've also done one before, but it is newly relevant in this segment.)

From problem 1, the waiting time between successive events is Exponentially distributed,

<math> p(t) = \lambda e^{-\lambda t} </math>

Its characteristic function is

<math> \phi(s) = E\left[e^{isT}\right] = \frac{\lambda}{\lambda - is} </math>

The waiting time between every <math>N</math>th event is the sum of <math>N</math> independent inter-event waiting times, so its characteristic function is the product of the <math>N</math> individual ones:

<math> \phi_N(s) = \left(\frac{\lambda}{\lambda - is}\right)^N </math>

This is precisely the characteristic function of the Gamma distribution

<math> p(x) = \lambda^N \frac{x^{N-1}}{(N-1)!} e^{-\lambda x} </math>, whose characteristic function is <math> \left(\frac{\lambda}{\lambda - is}\right)^N </math>

Since a distribution is uniquely determined by its characteristic function, the waiting time between every <math>N</math>th event follows the Gamma distribution.
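The identification above can be checked numerically: sums of <math>N</math> Exponential(<math>\lambda</math>) draws should agree in mean (<math>N/\lambda</math>) and variance (<math>N/\lambda^2</math>) with direct Gamma draws. The rate, <math>N</math>, and sample size below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, N, n = 2.0, 5, 200_000      # rate, step, sample size (illustrative choices)

# Waiting time to every Nth event: sum of N Exponential(lam) waits.
sums = rng.exponential(1 / lam, size=(n, N)).sum(axis=1)

# Direct draws from Gamma(shape=N, rate=lam); numpy uses scale = 1/rate.
gammas = rng.gamma(shape=N, scale=1 / lam, size=n)

# Both should have mean N/lam and variance N/lam**2.
print(sums.mean(), gammas.mean())
print(sums.var(), gammas.var())
```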

## Thought problem

### problem 1

In slide 5, showing the results of the MCMC, how can we be sure (or, how can we gather quantitative evidence) that there won't be another discrete change in <math>k_1</math> or <math>k_2</math> if we keep running the model longer. That is, how can we measure convergence of the model?
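One standard way to gather such quantitative evidence (a sketch, not necessarily the course's intended answer) is to run several chains from dispersed starting points and compare between-chain to within-chain variance, as in the Gelman-Rubin statistic: values of <math>\hat R</math> near 1 suggest the chains have mixed, while values well above 1 warn that another discrete change may still be lurking. The synthetic chains below are illustrative, not the model from the slides:

```python
import numpy as np

def gelman_rubin(chains):
    """Gelman-Rubin R-hat for an (m_chains, n_samples) array of one parameter."""
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    B = n * chain_means.var(ddof=1)           # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()     # within-chain variance
    var_hat = (n - 1) / n * W + B / n         # pooled variance estimate
    return np.sqrt(var_hat / W)

# Illustrative data: four well-mixed chains vs. four chains stuck at
# different levels (each offset by a constant).
rng = np.random.default_rng(3)
mixed = rng.normal(0.0, 1.0, size=(4, 1000))
stuck = mixed + np.arange(4)[:, None]

print(gelman_rubin(mixed))    # close to 1
print(gelman_rubin(stuck))    # noticeably larger than 1
```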

### problem 2

Suppose you have two hypotheses: H1 is that a set of times <math>t_i</math> are being generated as every 26th event from a Poisson process with rate 26. H2 is that they are every 27th event from a Poisson process with rate 27. (The mean rate is thus the same in both cases.) How would you estimate the number <math>N</math> of data points <math>t_i</math> that you need to clearly distinguish between these hypotheses?