Segment 30. Expectation Maximization (EM) Methods

From Computational Statistics Course Wiki

Watch this segment

(Don't worry, what you see statically below is not the beginning of the segment. Press the play button to start at the beginning.)

The direct YouTube link is http://youtu.be/StQOzRqTNsw

Links to the slides: PDF file or PowerPoint file

Problems

To Calculate

1. For a set of positive values $\{x_i\}$, use Jensen's inequality to show (a) the mean of their squares is never less than the square of their mean, and (b) their (arithmetic) mean is never less than their harmonic mean.
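Both inequalities are easy to spot-check numerically before proving them. The sketch below (a quick sanity check, not a proof) draws random positive values and verifies (a) and (b):

```python
import random

# Spot-check both Jensen-inequality claims on random positive data.
random.seed(1)
x = [random.uniform(0.1, 10.0) for _ in range(1000)]
n = len(x)

mean = sum(x) / n                                  # arithmetic mean
mean_of_squares = sum(v * v for v in x) / n        # mean of the squares
harmonic_mean = n / sum(1.0 / v for v in x)        # harmonic mean

# (a) mean of squares >= square of mean (x^2 is convex)
assert mean_of_squares >= mean ** 2
# (b) arithmetic mean >= harmonic mean (1/x is convex for x > 0)
assert mean >= harmonic_mean
print(mean_of_squares, mean ** 2, mean, harmonic_mean)
```

Equality holds in both cases only when all the $x_i$ are identical, which is the usual equality condition in Jensen's inequality.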

2. Sharpen the argument about termination of E-M methods that was given in slide 4, as follows: Suppose that $g(x) \ge f(x)$ for all $x$, for some two functions $f$ and $g$. Prove that, at any local maximum $x_m$ of $f$, one of these two conditions must hold: (1) $g(x_m) > f(x_m)$ [in which case the E-M algorithm has not yet terminated], or (2) $g(x_m)$ is a local maximum of $g$ [in which case the E-M algorithm terminates at a maximum of $g$, as advertised]. You can make any reasonable assumption about continuity of the functions.
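The practical consequence of this bounding argument is that each E-M iteration can never decrease the log-likelihood. The sketch below illustrates that monotone increase on a toy problem of my own choosing (a two-component 1-D Gaussian mixture with fixed unit variances, fit with hand-rolled E and M steps); none of these modeling choices come from the slides:

```python
import math
import random

random.seed(2)
# Synthetic data: two Gaussian clusters (means 0 and 4, sigma = 1).
data = ([random.gauss(0.0, 1.0) for _ in range(200)] +
        [random.gauss(4.0, 1.0) for _ in range(200)])

def normal_pdf(x, mu, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Deliberately poor starting guesses for the mixture weight and means.
pi1, mu1, mu2 = 0.5, -1.0, 1.0
loglikes = []
for _ in range(30):
    # E step: responsibility of component 1 for each point.
    r = []
    for x in data:
        p1 = pi1 * normal_pdf(x, mu1)
        p2 = (1 - pi1) * normal_pdf(x, mu2)
        r.append(p1 / (p1 + p2))
    # M step: re-estimate weight and means from the responsibilities.
    n1 = sum(r)
    pi1 = n1 / len(data)
    mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
    mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / (len(data) - n1)
    # Track the actual log-likelihood L(theta) after this iteration.
    loglikes.append(sum(math.log(pi1 * normal_pdf(x, mu1)
                                 + (1 - pi1) * normal_pdf(x, mu2)) for x in data))

# The bound proved on slide 4 implies L never decreases (up to roundoff).
assert all(b >= a - 1e-9 for a, b in zip(loglikes, loglikes[1:]))
print(loglikes[0], loglikes[-1], mu1, mu2)
```

Watching `loglikes` climb and then flatten is exactly the termination behavior the problem asks you to sharpen: the iteration stalls only where the bounding function is itself at a local maximum.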

To Think About

1. Jensen's inequality says something like "any concave function of a mixture of things is never less than the same mixture of the individual concave function values". What "mixture of things" is this idea being applied to in the proof of the E-M theorem (slide 4)?

2. So slide 4 proves that some function is less than the actual function of interest, namely $L(\theta)$. What makes this such a powerful idea?

Activity

The class activity for Friday can be found at EM activity.