Segment 9


Calculation Problems

1. Show that the sum of two independent Gaussian random variables is itself a Gaussian random variable. What are the mean and variance?

Let <math>X</math> and <math>Y</math> be two independent Gaussian random variables with means <math>\mu_X</math> and <math>\mu_Y</math> and variances <math>\sigma_X^2</math> and <math>\sigma_Y^2</math>. The characteristic function of <math>X</math> is

<math> \Phi_X(t) = \exp \left( it\mu_X - \frac{\sigma_X^2 t^2}{2}\right)</math>

Similarly for <math>\Phi_Y(t)</math>.

Since <math>X</math> and <math>Y</math> are independent, the characteristic function of their sum is the product of the two separate characteristic functions:

<math>\begin{align} \Phi_{X+Y}(t) &= \Phi_X(t)\Phi_Y(t) \\ &= \exp \left( it\mu_X - \frac{\sigma_X^2 t^2}{2}\right) \exp \left( it\mu_Y - \frac{\sigma_Y^2 t^2}{2}\right)\\ &= \exp\left( it(\mu_X + \mu_Y) - \frac{(\sigma_X^2 + \sigma_Y^2)t^2}{2}\right) \\ &= \exp\left ( it\mu_Z - \frac{\sigma_Z^2 t^2}{2}\right) = \Phi_Z(t) \end{align} </math>

This is the characteristic function of a Gaussian, so the new random variable <math> Z = X + Y</math> is itself Gaussian, with mean <math> \mu_X + \mu_Y </math> and variance <math> \sigma_X^2 + \sigma_Y^2 </math>.
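
As a quick numerical sanity check, the claim can be verified by Monte Carlo: draw samples of <math>X</math> and <math>Y</math>, add them, and compare the sample mean and variance of <math>Z</math> to <math>\mu_X + \mu_Y</math> and <math>\sigma_X^2 + \sigma_Y^2</math>. The sketch below is a minimal NumPy example; the particular means, variances, and sample size are arbitrary illustrative choices, not part of the problem.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative parameters (not specified in the problem).
mu_x, sigma_x = 1.0, 2.0
mu_y, sigma_y = -0.5, 1.5
n = 1_000_000

# Draw independent Gaussian samples and form Z = X + Y.
x = rng.normal(mu_x, sigma_x, n)
y = rng.normal(mu_y, sigma_y, n)
z = x + y

# The sample moments should match mu_X + mu_Y and sigma_X^2 + sigma_Y^2.
print("sample mean of Z:", z.mean(), " expected:", mu_x + mu_y)
print("sample var  of Z:", z.var(),  " expected:", sigma_x**2 + sigma_y**2)

# A histogram of z (e.g. with matplotlib) would also look Gaussian,
# consistent with the characteristic-function argument above.
</syntaxhighlight>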


2. Calculate the characteristic function of the Exponential distribution.

<math>\begin{align} p(x) &= \beta e^{-\beta x}, & x \ge 0 \\ \Phi(t) &= \int_0^{\infty} e^{itx} p(x)\,dx \\ &= \int_0^{\infty} \beta e^{(it-\beta)x}\,dx \\ &= \left[\frac{\beta}{it - \beta} e^{(it-\beta)x} \right]_0^{\infty}\\ &= \frac{\beta}{\beta - it} \end{align}</math>

where the upper limit vanishes because <math>|e^{itx}| = 1</math> and <math>e^{-\beta x} \to 0</math> as <math>x \to \infty</math>.
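
This result can likewise be checked numerically: estimate <math>E[e^{itX}]</math> by averaging <math>e^{itx}</math> over exponential samples and compare it to <math>\beta/(\beta - it)</math>. The sketch below is a minimal NumPy example; the rate <math>\beta</math> and the test values of <math>t</math> are arbitrary illustrative choices.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

beta = 2.0   # arbitrary illustrative rate parameter
n = 1_000_000

# NumPy's exponential is parameterized by the scale 1/beta,
# so these samples have density p(x) = beta * exp(-beta x).
x = rng.exponential(scale=1.0 / beta, size=n)

for t in (0.5, 1.0, 3.0):
    estimate = np.exp(1j * t * x).mean()   # Monte Carlo estimate of E[exp(itX)]
    exact = beta / (beta - 1j * t)         # characteristic function derived above
    print(f"t = {t}: Monte Carlo {complex(estimate):.4f}, exact {exact:.4f}")
</syntaxhighlight>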


Food for Thought Problems

Class Activity

Group: Noah, Kai, Tameen, Jin, Trettels