Dan's Segment 9


To Calculate

1. Because <math>X_1</math> and <math>X_2</math> are independent, the characteristic function of their sum is the product of the individual characteristic functions (the Fourier convolution theorem, as outlined in slide 3):

<math>\Phi(t)=\Phi_{X_1}(t)\Phi_{X_2}(t)=e^{i(\mu_1+\mu_2)t-\frac 12 (\sigma_1^2+\sigma_2^2)t^2}</math>

This is the characteristic function of a Gaussian distribution with mean <math>\mu_1 + \mu_2</math> and variance <math>\sigma_1^2+\sigma_2^2</math>, so the sum <math>X_1 + X_2</math> is itself Gaussian with those parameters.
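As a quick numerical sanity check (my own addition, not part of the segment, with arbitrarily chosen parameters), the following Python sketch draws samples from two independent Gaussians and confirms that their sum has mean <math>\mu_1+\mu_2</math> and variance <math>\sigma_1^2+\sigma_2^2</math>:

<syntaxhighlight lang="python">
import numpy as np

# Example parameters (chosen arbitrarily for illustration)
mu1, sigma1 = 1.0, 2.0
mu2, sigma2 = -3.0, 0.5

rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.normal(mu1, sigma1, n)   # samples of X_1
x2 = rng.normal(mu2, sigma2, n)   # samples of X_2
s = x1 + x2                       # samples of the sum X_1 + X_2

print("sample mean:", s.mean(), " expected:", mu1 + mu2)
print("sample var: ", s.var(),  " expected:", sigma1**2 + sigma2**2)
</syntaxhighlight>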

2. For an exponential distribution <math>P(x) = \lambda e^{-\lambda x}</math> for <math>x \ge 0</math>, the characteristic function is the Fourier transform of the density:

<math>\Phi_X(t)=\int_0^{\infty}e^{itx} \lambda e^{-\lambda x} dx = \frac \lambda{it-\lambda} e^{(it-\lambda) x}|_0^\infty = \frac \lambda{\lambda-it} </math>

We only integrate from 0 to infinity because the exponential distribution is zero for all <math>x<0</math>; the boundary term at infinity vanishes because <math>\operatorname{Re}(it-\lambda) = -\lambda < 0</math>.
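To check the closed form <math>\lambda/(\lambda-it)</math> numerically (again a sketch I added, with an arbitrary example rate), one can estimate the characteristic function as the sample average of <math>e^{itX}</math> over exponential draws and compare it with the formula:

<syntaxhighlight lang="python">
import numpy as np

lam = 1.5                                  # example rate parameter (assumption)
rng = np.random.default_rng(0)
x = rng.exponential(1.0 / lam, 500_000)    # NumPy parameterizes by the scale 1/lambda

for t in (0.0, 0.5, 2.0):
    empirical = complex(np.mean(np.exp(1j * t * x)))   # Monte Carlo estimate of E[e^{itX}]
    exact = lam / (lam - 1j * t)                       # closed form derived above
    print(f"t={t}:  empirical {empirical:.4f}   exact {exact:.4f}")
</syntaxhighlight>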


To Think About

2. I could see characteristic functions being useful for finding the moments of many different random variables added together, through use of the convolution theorem. Otherwise, I'm not sure how useful they are, and even in the case of adding variables I would not be surprised if a more brute-force calculation turned out to be just as fast most of the time.
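As an illustration of the moment idea (a hedged sketch of my own, not from the segment): since <math>E[X^n] = i^{-n}\,\Phi_X^{(n)}(0)</math>, differentiating the exponential characteristic function from problem 2 recovers <math>E[X^n] = n!/\lambda^n</math>. The SymPy snippet below does this symbolically:

<syntaxhighlight lang="python">
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
phi = lam / (lam - sp.I * t)   # characteristic function of Exponential(lambda), from problem 2

for n in range(1, 4):
    # n-th moment = (1/i^n) * n-th derivative of phi at t = 0
    moment = sp.simplify(sp.diff(phi, t, n).subs(t, 0) / sp.I**n)
    print(f"E[X^{n}] =", moment)   # expect n! / lambda^n
</syntaxhighlight>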