# Segment 30. Expectation Maximization (EM) Methods

## Contents

#### Watch this segment

(Don't worry: the static frame you see below is not the beginning of the segment. Press the play button to start from the beginning.)

{{#widget:Iframe |url=http://www.youtube.com/v/StQOzRqTNsw&hd=1 |width=800 |height=625 |border=0 }}

Links to the slides: PDF file or PowerPoint file

### Problems

#### To Calculate

1. For a set of positive values $\{x_i\}$, use Jensen's inequality to show (a) the mean of their squares is never less than the square of their mean, and (b) their (arithmetic) mean is never less than their harmonic mean.
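Both inequalities are easy to check numerically before proving them. Here is a minimal sketch (the data set and sample size are arbitrary choices, not part of the problem) that verifies (a) and (b) for a random set of positive values:

```python
import random

# Hypothetical data: 1000 random positive values.
random.seed(42)
xs = [random.uniform(0.1, 10.0) for _ in range(1000)]
n = len(xs)

mean = sum(xs) / n
mean_of_squares = sum(x * x for x in xs) / n       # E[x^2]
square_of_mean = mean ** 2                         # (E[x])^2
harmonic_mean = n / sum(1.0 / x for x in xs)       # n / sum(1/x_i)

# (a) E[x^2] >= (E[x])^2: Jensen's inequality with the convex function x -> x^2.
assert mean_of_squares >= square_of_mean

# (b) arithmetic mean >= harmonic mean: Jensen's inequality with the
# convex function x -> 1/x, valid because all x_i are positive.
assert mean >= harmonic_mean
```

The proof then amounts to identifying the convex function to which Jensen's inequality is applied in each case.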

2. Sharpen the argument about termination of E-M methods that was given in slide 4, as follows: Suppose that $g(x) \ge f(x)$ for all $x$, for some two functions $f$ and $g$. Prove that, at any local maximum $x_m$ of $f$, one of these two conditions must hold: (1) $g(x_m) > f(x_m)$ [in which case the E-M algorithm has not yet terminated], or (2) $g(x_m)$ is a local maximum of $g$ [in which case the E-M algorithm terminates at a maximum of $g$, as advertised]. You can make any reasonable assumption about continuity of the functions.
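A concrete pair of functions can make the dichotomy vivid. The following sketch (the particular $f$ and $g$ are illustrative choices, not from the slides) exhibits both cases at the local maximum $x_m = 0$ of $f(x) = -x^2$:

```python
# Hypothetical illustration of the two cases, with f(x) = -x^2
# (local maximum at x_m = 0).

def f(x):
    return -x * x

# Case (1): g1 >= f everywhere, but g1(0) > f(0), so the bound is
# strictly above f at x_m and the E-M iteration has not terminated.
def g1(x):
    return -x * x + 0.5 * (x - 1.0) ** 2   # g1 - f = (x-1)^2/2 >= 0

assert g1(0.0) > f(0.0)

# Case (2): g2 >= f with equality at x_m = 0; then g2 - f attains its
# minimum there, and in this example 0 is also a local maximum of g2.
def g2(x):
    return -x * x + x ** 4                 # g2 - f = x^4 >= 0, zero at 0

assert g2(0.0) == f(0.0)
eps = 1e-4
assert g2(eps) < g2(0.0) and g2(-eps) < g2(0.0)
```

The exercise asks you to show that these two cases are exhaustive under reasonable smoothness assumptions.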

3. Slide 4 proves that some function is everywhere less than or equal to the actual function of interest, namely $L(\theta)$. What makes this such a powerful idea?
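The power of the lower-bound idea shows up in practice: since each M-step maximizes a bound that touches $L(\theta)$ at the current parameters, the likelihood can never decrease. A minimal sketch, assuming a one-dimensional mixture of two unit-variance Gaussians (the data, initial guesses, and iteration count are arbitrary), demonstrates this monotonicity:

```python
import math
import random

random.seed(0)
# Hypothetical data: 400 points from a two-component Gaussian mixture.
data = ([random.gauss(-2.0, 1.0) for _ in range(200)] +
        [random.gauss(3.0, 1.0) for _ in range(200)])

def norm_pdf(x, mu):
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2.0 * math.pi)

def log_likelihood(data, w, mu1, mu2):
    return sum(math.log(w * norm_pdf(x, mu1) + (1.0 - w) * norm_pdf(x, mu2))
               for x in data)

# Crude initial guesses for mixing weight and the two means.
w, mu1, mu2 = 0.5, -1.0, 1.0
ll_history = [log_likelihood(data, w, mu1, mu2)]
for _ in range(20):
    # E-step: posterior responsibility of component 1 for each point.
    r = [w * norm_pdf(x, mu1) /
         (w * norm_pdf(x, mu1) + (1.0 - w) * norm_pdf(x, mu2))
         for x in data]
    # M-step: maximize the lower bound, giving weighted means and weight.
    w = sum(r) / len(data)
    mu1 = sum(ri * x for ri, x in zip(r, data)) / sum(r)
    mu2 = sum((1.0 - ri) * x for ri, x in zip(r, data)) / (len(data) - sum(r))
    ll_history.append(log_likelihood(data, w, mu1, mu2))

# The log-likelihood is nondecreasing across iterations.
assert all(b >= a - 1e-9 for a, b in zip(ll_history, ll_history[1:]))
```

Maximizing the bound is typically a much easier problem than maximizing $L(\theta)$ directly, yet the touching condition guarantees that every step is uphill on $L$ itself.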