# Eleisha's Segment 30: Expectation Maximization (EM) Methods

**To Calculate**
1. For a set of positive values $\{x_i\}$, use Jensen's inequality to show (a) that the mean of their squares is never less than the square of their mean, and (b) that their arithmetic mean is never less than their harmonic mean.
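Before attempting the proof, it can help to see both inequalities hold numerically. This is an illustrative sanity check (random positive data, not part of the assignment): Jensen's inequality with the convex function $t \mapsto t^2$ gives mean-of-squares $\ge$ square-of-mean, and with $t \mapsto 1/t$ (convex for $t > 0$) it gives arithmetic mean $\ge$ harmonic mean.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.1, 10.0, size=1000)   # a set of positive values {x_i}

mean_sq = np.mean(x**2)                 # mean of the squares
sq_mean = np.mean(x)**2                 # square of the mean
harmonic = x.size / np.sum(1.0 / x)     # harmonic mean

# (a) mean of squares >= square of mean
assert mean_sq >= sq_mean
# (b) arithmetic mean >= harmonic mean
assert np.mean(x) >= harmonic
```

Note that the gap in (a) is exactly the variance of the $\{x_i\}$, which is one way to see why equality holds only when all values are equal.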

2. Sharpen the argument about the termination of E-M methods that was given in slide 4, as follows: Suppose that $g(x) \ge f(x)$ for all $x$, for some two functions $f$ and $g$. Prove that, at any local maximum $x_m$ of $f$, one of these two conditions must hold: (1) $g(x_m) > f(x_m)$ [in which case the E-M algorithm has not yet terminated], or (2) $x_m$ is a local maximum of $g$ [in which case the E-M algorithm terminates at a maximum of $g$, as advertised]. You can make any reasonable assumption about the continuity of the functions.
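A concrete pair of functions may make the two cases easier to picture. This is an assumed illustration (not from the slides): take $f(x) = -x^2$, which has a local maximum at $x_m = 0$, and $g(x) = f(x) + \tfrac{1}{2}(x-a)^2$, which satisfies $g \ge f$ everywhere. Moving the parameter $a$ switches between the two cases of the problem.

```python
import numpy as np

def f(x):
    return -x**2                    # has a local maximum at x_m = 0

def g(x, a):
    return f(x) + 0.5 * (x - a)**2  # g - f = 0.5*(x-a)^2 >= 0, so g >= f

xs = np.linspace(-3.0, 3.0, 2001)

# Case (1): the bound is loose at x_m (a != 0), so g(x_m) > f(x_m)
a = 1.0
assert np.all(g(xs, a) >= f(xs))
assert g(0.0, a) > f(0.0)           # E-M has not yet terminated

# Case (2): the bound is tight at x_m (a = 0); then g(x) = -0.5*x^2,
# and x_m = 0 is a local maximum of g as well
a = 0.0
assert np.all(g(xs, a) >= f(xs))
assert g(0.0, a) == f(0.0)
assert np.all(g(xs, a) <= g(0.0, a))  # x_m maximizes g: E-M terminates
```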

**To Think About**
1. Jensen's inequality says something like "any concave function of a mixture of things is greater than the same mixture of the individual concave functions". What "mixture of things" is this idea being applied to in the proof of the E-M theorem (slide 4)?
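As a reminder of where the concavity of the log gets used, here is a sketch of the standard lower-bound step (the slide's exact notation may differ; $q(z)$ denotes any distribution over the latent variable $z$):

```latex
\begin{align*}
\log L(\theta) &= \log \sum_z p(x, z \mid \theta)
              = \log \sum_z q(z)\,\frac{p(x, z \mid \theta)}{q(z)} \\
             &\ge \sum_z q(z)\,\log \frac{p(x, z \mid \theta)}{q(z)},
\end{align*}
```

where the inequality is Jensen's, applied because $\log$ is concave and the weights $q(z)$ sum to one. Identifying the "mixture of things" in this step is exactly the question being asked.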

2. So slide 4 proves that some function is less than the actual function of interest, namely $L(\theta)$. What makes this such a powerful idea?
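One way to appreciate the power of maximizing a lower bound is to watch it in action. Below is a minimal EM sketch for a two-component 1-D Gaussian mixture on synthetic data (an assumed example, not from the slides): because each M-step maximizes the Jensen lower bound, which touches $L(\theta)$ at the current parameters, the log-likelihood can never decrease from one iteration to the next.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data: a mixture of two Gaussians (illustrative choice)
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 300)])

def component_densities(x, pi, mu1, mu2, s1, s2):
    """Weighted normal densities of each component at the points x."""
    p1 = pi * np.exp(-0.5 * ((x - mu1) / s1)**2) / (s1 * np.sqrt(2 * np.pi))
    p2 = (1 - pi) * np.exp(-0.5 * ((x - mu2) / s2)**2) / (s2 * np.sqrt(2 * np.pi))
    return p1, p2

# Deliberately poor initial guesses
pi, mu1, mu2, s1, s2 = 0.5, -1.0, 1.0, 1.0, 1.0

ll_trace = []
for _ in range(50):
    # E-step: posterior responsibility of component 1 for each point
    p1, p2 = component_densities(x, pi, mu1, mu2, s1, s2)
    r = p1 / (p1 + p2)
    # M-step: maximize the Jensen lower bound -> responsibility-weighted MLEs
    pi = np.mean(r)
    mu1 = np.sum(r * x) / np.sum(r)
    mu2 = np.sum((1 - r) * x) / np.sum(1 - r)
    s1 = np.sqrt(np.sum(r * (x - mu1)**2) / np.sum(r))
    s2 = np.sqrt(np.sum((1 - r) * (x - mu2)**2) / np.sum(1 - r))
    # Record the actual log-likelihood L(theta) after the update
    p1, p2 = component_densities(x, pi, mu1, mu2, s1, s2)
    ll_trace.append(np.sum(np.log(p1 + p2)))

# The bound guarantees L(theta) is non-decreasing across iterations
assert all(b >= a - 1e-9 for a, b in zip(ll_trace, ll_trace[1:]))
```

The key point the question is driving at: the bound is easy to maximize in closed form, yet tightness at the current parameters means each such easy step is guaranteed not to decrease the hard-to-maximize $L(\theta)$.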

**Back To:** Eleisha Jackson