# Seg41. Markov Chain Monte Carlo, Example 2

## Skilled problem

### problem 1

Show that the waiting times (times between events) in a Poisson process are Exponentially distributed. (I think we've done this before.)

A This question is essentially the same as the class activity in Segment 10, problem 2.

Let $N_t$ be the number of events that have occurred by time $t$, and let $T_n$ be the time at which the $n$th event occurs. Then

$p(N_t \ge n) = p(T_n \le t)$

Also from lecture we know the pdf of $T_n$ is $p(T_n = t) = \frac{\lambda^n}{(n-1)!}t^{n-1}e^{-\lambda t}$, and $p(T_n \le t) = \int_0^t p(T_n = x)\,dx$

Then

$p(N_t = n) = p(N_t \ge n) - p(N_t \ge n+1) = p(T_n \le t) - p(T_{n+1} \le t) = \frac{(\lambda t)^ne^{-\lambda t}}{n!} \ldots \ldots (*)$
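Formula $(*)$ can be checked numerically by integrating the pdf of $T_n$ given above (a quick sketch; the values of $\lambda$ and $t$ here are arbitrary choices):

```python
import math
import numpy as np

lam, t = 1.5, 2.0

def cdf_Tn(n, lam, t, steps=200_000):
    # p(T_n <= t): numerically integrate the pdf of T_n from the lecture
    x = np.linspace(1e-12, t, steps)
    pdf = lam**n / math.factorial(n - 1) * x**(n - 1) * np.exp(-lam * x)
    return float(pdf.sum() * (x[1] - x[0]))

# Check the telescoping identity (*) for a few values of n:
# p(N_t = n) = p(T_n <= t) - p(T_{n+1} <= t) should equal the Poisson pmf.
for n in range(1, 5):
    lhs = cdf_Tn(n, lam, t) - cdf_Tn(n + 1, lam, t)
    rhs = (lam * t) ** n * math.exp(-lam * t) / math.factorial(n)
    print(n, lhs, rhs)   # lhs and rhs agree to integration accuracy
```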

The waiting time until the first event exceeds $t$ exactly when no events occur in $[0, t]$, so plugging $n = 0$ into $(*)$ gives

$p(T_1 > t) = p(N_t = 0) = e^{-\lambda t}$

The CDF of the waiting time is therefore $p(T_1 \le t) = 1 - e^{-\lambda t}$, and differentiating gives its pdf,

$p(t) = \lambda e^{-\lambda t}$

which is the Exponential distribution.
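This can also be seen in simulation. The sketch below builds a Poisson process without assuming exponential gaps (given the total count, event times are i.i.d. uniform on the interval) and checks that the resulting waiting times have the Exponential mean and tail; the rate and window length are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, T = 2.0, 10_000.0

# Simulate one long Poisson process: draw the total count, then place
# that many i.i.d. Uniform(0, T) event times and sort them.
n_events = rng.poisson(lam * T)
times = np.sort(rng.uniform(0.0, T, size=n_events))
waits = np.diff(times)

# The waiting times should look Exponential(lam):
print(waits.mean())          # ~ 1/lam = 0.5
print(np.mean(waits > 1.0))  # ~ exp(-lam * 1) = exp(-2) ≈ 0.135
```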

### problem 2

Plot the pdf's of the waiting times between (a) every other Poisson event, and (b) every Poisson event at half the rate.

A

For (a), the waiting time between every other event is the time until the second event, so using the pdf of $T_n$ from problem 1 with $n = 2$:

$p(t) = \frac{\lambda^2}{1!} t e^{-\lambda t} = \lambda^2 t e^{-\lambda t}$

For (b), every event at half the rate is simply a Poisson process with rate $\lambda/2$, so by problem 1 the waiting-time pdf is

$p(t) = \frac{\lambda}{2} e^{-\lambda t/2}$

So the plot is:

```python
import numpy as np
import matplotlib.pyplot as plt

def everyother(r, x):
    # Gamma pdf with shape 2, rate r: waiting time between every other event
    return r**2 * x * np.exp(-r * x)

def everyhalf(r, x):
    # Exponential pdf at half the rate
    return r / 2 * np.exp(-r * x / 2)

xx = np.arange(0, 10, 0.1)

plt.figure(figsize=(6, 4))
plt.plot(xx, everyother(1., xx), label='every other event')
plt.plot(xx, everyhalf(1., xx), label='half rate')
plt.xlabel('time')
plt.ylabel('probability density')
plt.legend()
plt.show()
```
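As a numerical sanity check (a sketch; the sample size and integration grid are arbitrary choices), the case (a) pdf should match simulated gaps between every other event, and both pdfs should integrate to 1:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, n_gaps = 1.0, 100_000

# The gap between every other event is the sum of two Exponential(lam)
# inter-event gaps; the Gamma pdf lam**2 * t * exp(-lam*t) predicts
# mean 2/lam and variance 2/lam**2.
double_gaps = rng.exponential(1.0 / lam, size=(n_gaps, 2)).sum(axis=1)
print(double_gaps.mean(), double_gaps.var())   # both ~ 2.0 for lam = 1

# Both pdfs should integrate to ~1 over [0, inf); truncate at t = 50.
xx = np.linspace(0.0, 50.0, 500_001)
dx = xx[1] - xx[0]
print((lam**2 * xx * np.exp(-lam * xx)).sum() * dx)   # ~ 1
print((lam / 2 * np.exp(-lam * xx / 2)).sum() * dx)   # ~ 1
```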


### problem 3

Show, using characteristic functions, that the waiting times between every Nth event in a Poisson process is Gamma distributed. (I think we've also done one before, but it is newly relevant in this segment.)

From problem 1, the waiting time between successive events is Exponentially distributed:

$p(t) = \lambda e^{-\lambda t}$

The characteristic function of this distribution is

$\phi(\omega) = \int_0^\infty e^{i\omega t}\,\lambda e^{-\lambda t}\,dt = \frac{\lambda}{\lambda - i\omega}$

The waiting time between every $N$th event is the sum of $N$ independent inter-event waiting times, so its characteristic function is the product of the $N$ individual ones:

$\phi_N(\omega) = \left(\frac{\lambda}{\lambda - i\omega}\right)^N$

The Gamma distribution

$p(x) = \lambda^N \frac{x^{N-1}}{(N-1)!} e^{-\lambda x}$

has exactly this characteristic function, $\left(\frac{\lambda}{\lambda - i\omega}\right)^N$. Since a distribution is uniquely determined by its characteristic function, the waiting time between every $N$th event follows the Gamma distribution with shape $N$.
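The result can be verified in simulation (a sketch; $\lambda$, $N$, and the test frequency are arbitrary choices): sums of $N$ exponential gaps should show the Gamma mean $N/\lambda$ and variance $N/\lambda^2$, and the empirical characteristic function should match $(\lambda/(\lambda - i\omega))^N$.

```python
import numpy as np

rng = np.random.default_rng(3)
lam, N, trials = 1.5, 5, 200_000

# Waiting time between every Nth event = sum of N Exponential(lam) gaps.
waits = rng.exponential(1.0 / lam, size=(trials, N)).sum(axis=1)

# Gamma(shape N, rate lam) predicts mean N/lam and variance N/lam**2.
print(waits.mean())   # ~ N/lam
print(waits.var())    # ~ N/lam**2

# Compare the empirical characteristic function E[exp(i*w*X)] with
# (lam / (lam - i*w))**N at a test frequency w.
w = 1.0
phi_emp = np.exp(1j * w * waits).mean()
phi_th = (lam / (lam - 1j * w)) ** N
print(abs(phi_emp - phi_th))   # small (Monte Carlo error only)
```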

## Thought problem

### problem 1

In slide 5, showing the results of the MCMC, how can we be sure (or, how can we gather quantitative evidence) that there won't be another discrete change in $k_1$ or $k_2$ if we keep running the model longer. That is, how can we measure convergence of the model?
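One standard way to gather quantitative evidence of convergence (not prescribed by the slides; this is a sketch of the Gelman-Rubin diagnostic, with function name and thresholds chosen here for illustration) is to run several independent chains from dispersed starting points and compare between-chain to within-chain variance:

```python
import numpy as np

def gelman_rubin(chains):
    """Gelman-Rubin R-hat for equal-length chains.

    chains: array of shape (m, n) -- m independent chains, n samples each.
    Values near 1 suggest the chains have mixed; values well above 1
    (a common rule of thumb is > 1.1) suggest running longer.
    """
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    B = n * chain_means.var(ddof=1)            # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()      # mean within-chain variance
    var_hat = (n - 1) / n * W + B / n          # pooled variance estimate
    return np.sqrt(var_hat / W)

# Two well-mixed chains sampling the same distribution: R-hat ~ 1.
rng = np.random.default_rng(4)
good = rng.normal(0.0, 1.0, size=(2, 5_000))
print(gelman_rubin(good))
```

If another discrete jump in $k_1$ or $k_2$ were still to come, chains started in different modes would disagree and $\hat R$ would stay well above 1.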

### problem 2

Suppose you have two hypotheses: H1 is that a set of times $t_i$ are being generated as every 26th event from a Poisson process with rate 26. H2 is that they are every 27th event from a Poisson process with rate 27. (The mean rate is thus the same in both cases.) How would you estimate the number $N$ of data points $t_i$ that you need to clearly distinguish between these hypotheses?