Segment 10: Daniel Shepard


Problems

To Calculate

1. Take 12 random values, each uniform between 0 and 1. Add them up and subtract 6. Prove that the result is close to a random value drawn from the Normal distribution with mean zero and standard deviation 1.

Solution:

First, the subtraction of 6 can be absorbed into the uniform distributions, giving the equivalent problem of summing 12 random values, each distributed uniformly on $[-1/2, 1/2]$. Each such uniform has characteristic function $\sin(t/2)/(t/2)$, so the characteristic function of the sum of 12 independent copies is

   $\phi(t) = \left[\frac{\sin(t/2)}{t/2}\right]^{12}$

The plot below shows that this characteristic function is nearly identical to that of a normal distribution with mean 0 and standard deviation 1, namely $e^{-t^2/2}$. The mean and standard deviation computed from the characteristic function, via $\mu = -i\,\phi'(0)$ and $\sigma^2 = -\phi''(0) - \mu^2$, are exactly 0 and 1 respectively.

[Figure: Segment10Prob1.png — characteristic function of the sum compared with $e^{-t^2/2}$]
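
This comparison is easy to reproduce numerically. The sketch below (assuming NumPy and Matplotlib are available; the grid and sample count are arbitrary choices) plots both characteristic functions and also simulates the sum directly:

    import numpy as np
    import matplotlib.pyplot as plt

    # Characteristic function of the sum of 12 Uniform(-1/2, 1/2) draws,
    # compared with that of a standard normal, exp(-t^2/2).
    t = np.linspace(-10, 10, 1001)
    phi_sum = np.sinc(t / (2 * np.pi)) ** 12   # np.sinc(x) = sin(pi*x)/(pi*x)
    phi_norm = np.exp(-t**2 / 2)

    plt.plot(t, phi_sum, label='[sin(t/2)/(t/2)]^12')
    plt.plot(t, phi_norm, '--', label='exp(-t^2/2)')
    plt.xlabel('t'); plt.legend(); plt.show()

    # Direct simulation: 10^5 sums of 12 uniforms, minus 6.
    rng = np.random.default_rng(0)
    samples = rng.uniform(0, 1, size=(100_000, 12)).sum(axis=1) - 6
    print(samples.mean(), samples.std())   # close to 0 and 1 respectively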


2. Invent a family of functions, each different, that look like those in Slide 3: they all have value 1 at x = 0; they all have zero derivative at x = 0; and they generally (not necessarily monotonically) decrease to zero at large x. Now multiply 10 of them together and graph the result near the origin (i.e., reproduce what Slide 3 was sketching).

Solution:

I will use the family of scaled sinc functions $f_k(x) = \sin(x/k)/(x/k)$: each has value 1 at $x = 0$, zero derivative there, and decays (non-monotonically) to zero at large $|x|$. The plot below shows these functions for $k = 1, \ldots, 10$ together with their product, which near the origin looks approximately Gaussian.

[Figure: Segment10Prob2.png — the ten sinc functions and their product near the origin]
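
To reproduce the figure, the following sketch (again assuming NumPy and Matplotlib; the $x$-range is arbitrary) multiplies the ten factors and overlays a Gaussian whose curvature at the origin is matched to the product:

    import numpy as np
    import matplotlib.pyplot as plt

    def f(x, k):
        """Scaled sinc: sin(x/k)/(x/k), equal to 1 at x = 0."""
        return np.sinc(x / (k * np.pi))   # np.sinc(x) = sin(pi*x)/(pi*x)

    x = np.linspace(-6, 6, 1001)
    product = np.ones_like(x)
    for k in range(1, 11):
        y = f(x, k)
        plt.plot(x, y, lw=0.5)
        product *= y

    # Each factor behaves like 1 - x^2/(6 k^2) near 0, so the product behaves
    # like exp(-x^2/(2 sigma^2)) with 1/sigma^2 = sum over k of 1/(3 k^2).
    sigma2 = 1 / sum(1 / (3 * k**2) for k in range(1, 11))
    plt.plot(x, product, 'k', lw=2, label='product of 10')
    plt.plot(x, np.exp(-x**2 / (2 * sigma2)), 'r--', label='matched Gaussian')
    plt.legend(); plt.show()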


3. For what value(s) of $\nu$ does the Student distribution (Segment 8, Slide 4) have a convergent 1st and 2nd moment, but divergent 3rd and higher moments?

Solution:

The Student distribution (taking $\mu = 0$ and $\sigma = 1$ without loss of generality) has density $\frac{\Gamma\left(\frac{\nu+1}{2}\right)}{\sqrt{\nu\pi}\,\Gamma\left(\frac{\nu}{2}\right)}\left(1 + \frac{t^2}{\nu}\right)^{-\frac{\nu+1}{2}}$, which falls off as $|t|^{-(\nu+1)}$ at large $|t|$. Its first moment is given by

   $\langle t \rangle = \frac{\Gamma\left(\frac{\nu+1}{2}\right)}{\sqrt{\nu\pi}\,\Gamma\left(\frac{\nu}{2}\right)} \int_{-\infty}^{\infty} t \left(1 + \frac{t^2}{\nu}\right)^{-\frac{\nu+1}{2}} dt$

For large $|t|$, the integrand falls off as $|t|^{-\nu}$, so this integral converges for $\nu > 1$. The second moment is given by

   $\langle t^2 \rangle = \frac{\Gamma\left(\frac{\nu+1}{2}\right)}{\sqrt{\nu\pi}\,\Gamma\left(\frac{\nu}{2}\right)} \int_{-\infty}^{\infty} t^2 \left(1 + \frac{t^2}{\nu}\right)^{-\frac{\nu+1}{2}} dt$

For large $|t|$, the integrand is approximately $|t|^{1-\nu}$, so this integral converges for $\nu > 2$. The third moment is given by

   $\langle t^3 \rangle = \frac{\Gamma\left(\frac{\nu+1}{2}\right)}{\sqrt{\nu\pi}\,\Gamma\left(\frac{\nu}{2}\right)} \int_{-\infty}^{\infty} t^3 \left(1 + \frac{t^2}{\nu}\right)^{-\frac{\nu+1}{2}} dt$

For large $|t|$, the integrand is approximately $|t|^{2-\nu}$, so this integral converges for $\nu > 3$. Higher moments require still larger values of $\nu$ to converge, so the answer is $2 < \nu \le 3$.
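
These convergence conditions can be checked numerically by watching the truncated moment integrals as the cutoff grows. The sketch below (assuming SciPy; $\nu = 2.5$ and the cutoffs are my own choices) shows the first and second absolute moments settling down while the third keeps growing:

    import numpy as np
    from scipy import integrate, special

    def student_density(t, nu):
        """Student t density with mu = 0, sigma = 1."""
        c = special.gamma((nu + 1) / 2) / (np.sqrt(nu * np.pi) * special.gamma(nu / 2))
        return c * (1 + t**2 / nu) ** (-(nu + 1) / 2)

    nu = 2.5   # expect: 1st and 2nd moments finite, 3rd divergent
    for p in (1, 2, 3):
        for L in (1e2, 1e4, 1e6):
            # By symmetry, integrate |t|^p over [0, L] and double.
            val, _ = integrate.quad(lambda t: t**p * student_density(t, nu),
                                    0, L, points=[1.0, 10.0], limit=500)
            print(f"|t|^{p}, cutoff {L:.0e}: {2 * val:.3f}")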


To Think About

1. A distribution with moments as in problem 3 above has a well-defined mean and variance. Does the CLT hold for the sum of RVs from such a distribution? If not, what goes wrong in the proof? Is the mean of the sum equal to the sum of the individual means? What about the variance of the sum? What, qualitatively, does the distribution of the sum of a bunch of them look like?
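
One way to explore this empirically (a sketch assuming SciPy; $\nu = 2.5$, the sample size, and the trial count are my own choices) is to standardize sums of Student-$t$ draws and compare them to a standard normal:

    import numpy as np
    from scipy import stats

    # Sums of Student-t(nu = 2.5) draws: finite mean and variance,
    # so the CLT does apply, but convergence in the tails is slow.
    rng = np.random.default_rng(1)
    nu, N, trials = 2.5, 1000, 20_000
    draws = stats.t(df=nu).rvs(size=(trials, N), random_state=rng)
    z = draws.sum(axis=1) / np.sqrt(N * nu / (nu - 2))   # exact variance of the sum
    print(stats.kstest(z, 'norm'))   # the core of the distribution looks normal
    print(np.abs(z).max())           # but extreme values betray the heavy tails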


2. Give an explanation of Bessel's correction in the last expression on slide 5. If, as we see, the MAP calculation gives the factor 1/N, why would one ever want to use 1/(N-1) instead? (There are various wiki and stackoverflow pages on this. See if they make sense to you!)
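
A quick simulation makes the bias concrete (a sketch assuming NumPy; the sample size and trial count are arbitrary): averaged over many samples, the 1/N estimator undershoots the true variance by the factor $(N-1)/N$, while the $1/(N-1)$ version is unbiased.

    import numpy as np

    # Estimate the variance of a unit-variance normal from many small samples.
    rng = np.random.default_rng(2)
    N, trials = 5, 200_000
    x = rng.normal(0, 1, size=(trials, N))
    v_n  = x.var(axis=1, ddof=0)   # divide by N (the MAP factor)
    v_n1 = x.var(axis=1, ddof=1)   # divide by N-1 (Bessel's correction)
    print(v_n.mean())    # ~ (N-1)/N = 0.8: biased low
    print(v_n1.mean())   # ~ 1.0: unbiased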


Just for fun

A fun problem that ties in to 'To Calculate' 1 above and problem 6 from the Probability Blitz:

1. What is the expected number of Uniform[0,1] draws you need to add up before the sum exceeds 1? Prove your answer analytically and confirm it by simulation.
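
A sketch for the simulation half (assuming NumPy; the trial count is arbitrary): the empirical average should approach the well-known analytic answer, $e \approx 2.71828$.

    import numpy as np

    # Count Uniform(0,1) draws needed for the running sum to exceed 1,
    # then average that count over many trials.
    rng = np.random.default_rng(3)
    trials = 200_000
    counts = np.empty(trials, dtype=int)
    for i in range(trials):
        total, n = 0.0, 0
        while total <= 1.0:
            total += rng.random()
            n += 1
        counts[i] = n
    print(counts.mean())   # converges to e = 2.71828...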