# Eleisha's Segment 9: Characteristic Functions


## Revision as of 15:30, 22 February 2014

**To Calculate:**

1. Use characteristic functions to show that the sum of two independent Gaussian random variables is itself a Gaussian random variable. What is its mean and variance?

The characteristic function of the sum of independent random variables is the product of their characteristic functions.

So let S = X + Y, the sum of two independent Gaussian random variables with means <math>\mu_X, \mu_Y</math> and variances <math>\sigma_X^2, \sigma_Y^2</math>.

The characteristic function of S is:

<math> \phi_S(t) = \phi_X(t)\,\phi_Y(t) = e^{i\mu_X t - \frac{1}{2}\sigma_X^2 t^2}\, e^{i\mu_Y t - \frac{1}{2}\sigma_Y^2 t^2} = e^{i(\mu_X + \mu_Y)t - \frac{1}{2}(\sigma_X^2 + \sigma_Y^2) t^2} </math>

This is again the characteristic function of a Gaussian, so S is itself Gaussian, with mean <math>\mu_X + \mu_Y</math> and variance <math>\sigma_X^2 + \sigma_Y^2</math>.
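The product rule can be sanity-checked numerically: the empirical characteristic function of samples of S = X + Y should match the Gaussian form with mean <math>\mu_X + \mu_Y</math> and variance <math>\sigma_X^2 + \sigma_Y^2</math>. A minimal sketch (the parameter values, seed, and test point t are arbitrary illustrative choices, not part of the segment):

```python
import numpy as np

rng = np.random.default_rng(0)
# Arbitrary illustrative parameters for the two Gaussians
mu_x, sig_x = 1.0, 2.0
mu_y, sig_y = -0.5, 1.5

x = rng.normal(mu_x, sig_x, 1_000_000)
y = rng.normal(mu_y, sig_y, 1_000_000)
s = x + y  # sum of two independent Gaussians

# Empirical characteristic function E[exp(i t S)] at a test point t
t = 0.7
phi_emp = np.mean(np.exp(1j * t * s))

# Gaussian characteristic function with mean mu_x + mu_y, variance sig_x^2 + sig_y^2
phi_theory = np.exp(1j * t * (mu_x + mu_y) - 0.5 * (sig_x**2 + sig_y**2) * t**2)

print(abs(phi_emp - phi_theory))  # small (Monte Carlo error, roughly 1e-3)
```

The sample mean and variance of S can be checked the same way against <math>\mu_X + \mu_Y</math> and <math>\sigma_X^2 + \sigma_Y^2</math>.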

2. Calculate (don't just look up) the characteristic function of the Exponential distribution.
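The integral to calculate is <math>\phi(t) = \int_0^\infty \lambda e^{-\lambda x} e^{itx}\,dx</math>, which evaluates to <math>\lambda/(\lambda - it)</math>. Whatever form one derives can be checked against the empirical characteristic function of exponential samples; a minimal sketch (the rate λ, seed, and test point t are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 2.0  # rate parameter (arbitrary illustrative value)
# NumPy parameterizes the exponential by scale = 1/rate
x = rng.exponential(1.0 / lam, 1_000_000)

t = 1.3  # test point (arbitrary)
phi_emp = np.mean(np.exp(1j * t * x))  # empirical E[exp(i t X)]
phi_theory = lam / (lam - 1j * t)      # lambda / (lambda - i t)

print(abs(phi_emp - phi_theory))  # small (Monte Carlo error, roughly 1e-3)
```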

**To Think About:**

1. Learn enough about contour integration to be able to make sense of Saul's explanation at the bottom of slide 7. Then draw a picture of the contours, label the pole(s), and show how you calculate their residues.

2. Do you think that characteristic functions are ever useful computationally (that is, not just analytically to prove theorems)?

**Back to:** Eleisha Jackson