# Segment 27. Mixture Models


#### Watch this segment

(Don't worry, what you see statically below is not the beginning of the segment. Press the play button to start at the beginning.)

{{#widget:Iframe |url=http://www.youtube.com/v/9pWnZcpYh44&hd=1 |width=800 |height=625 |border=0 }}

The direct YouTube link is http://youtu.be/9pWnZcpYh44

Links to the slides: PDF file or PowerPoint file

### Problems

#### To Calculate

The file Media:Mixturevals.txt contains 1000 values, each drawn either with probability $c$ from the distribution $\text{Exponential}(\beta)$ (for some constant $\beta$), or otherwise (with probability $1-c$) from the distribution $p(x) = (2/\pi)/(1+x^2),\; x>0$.
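As a quick sanity check (not part of the original problem statement), the second component really is a normalized density on $x>0$:

$\int_0^\infty \frac{2/\pi}{1+x^2}\,dx \;=\; \frac{2}{\pi}\,\arctan x \,\Big|_0^\infty \;=\; \frac{2}{\pi}\cdot\frac{\pi}{2} \;=\; 1.$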

1. Write down an expression for the probability of the file's data given some values for the parameters $\beta$ and $c$.

2. Calculate numerically the maximum likelihood values of $\beta$ and $c$.

3. Estimate numerically the Bayes posterior distribution of $\beta$, marginalizing over $c$ as a nuisance parameter. (You'll of course have to make some assumption about priors.)
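The three calculations above can be sketched in Python. Since Mixturevals.txt itself isn't reproduced here, the sketch draws its own synthetic data set from the mixture with made-up "true" parameters; it also assumes the rate parameterization $\text{Exponential}(\beta): p(x) = \beta e^{-\beta x}$, and flat priors on $\beta$ and $c$ for problem 3. All of these are assumptions of the sketch, not part of the problem statement.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.integrate import trapezoid

rng = np.random.default_rng(42)

# Synthetic stand-in for Mixturevals.txt: 1000 draws from the mixture with
# hypothetical "true" parameters beta = 3, c = 0.7.  The second component,
# p(x) = (2/pi)/(1+x^2) on x > 0, is a half-Cauchy, so we can sample it as
# the absolute value of a standard Cauchy deviate.
true_beta, true_c = 3.0, 0.7
n = 1000
from_exp = rng.random(n) < true_c
x = np.where(from_exp,
             rng.exponential(1.0 / true_beta, n),   # scale = 1/rate
             np.abs(rng.standard_cauchy(n)))

def neg_log_like(params, x):
    """Negative log-likelihood of the mixture (problem 1)."""
    beta, c = params
    mix = c * beta * np.exp(-beta * x) + (1 - c) * (2 / np.pi) / (1 + x**2)
    return -np.sum(np.log(mix))

# Problem 2: maximum-likelihood values of beta and c.
fit = minimize(neg_log_like, x0=[1.0, 0.5], args=(x,),
               bounds=[(1e-6, None), (1e-6, 1 - 1e-6)])
beta_mle, c_mle = fit.x
print(f"beta_MLE = {beta_mle:.3f}, c_MLE = {c_mle:.3f}")

# Problem 3: posterior for beta, marginalizing over c on a grid,
# assuming (for this sketch) flat priors on both parameters.
betas = np.linspace(0.5, 6.0, 200)
cs = np.linspace(0.01, 0.99, 99)
loglik = np.array([[-neg_log_like([b, c], x) for c in cs] for b in betas])
loglik -= loglik.max()                        # guard against overflow in exp
post_beta = trapezoid(np.exp(loglik), cs, axis=1)
post_beta /= trapezoid(post_beta, betas)      # normalize the marginal
print("posterior mean of beta:", trapezoid(betas * post_beta, betas))
```

To run this on the actual data, replace the synthetic `x` with `np.loadtxt("Mixturevals.txt")`. The brute-force grid marginalization is fine here because there are only two parameters; with more, MCMC would be the natural tool.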

#### To Think About

1. In problem 3, above, you assumed some definite prior for $c$. What if $c$ is itself drawn (just once for the whole data set) from a distribution $\text{Beta}(\mu,\nu)$, with unknown hyperparameters $\mu$ and $\nu$? How would you now estimate the Bayes posterior distribution of $\beta$, marginalizing over everything else?
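One way to organize the calculation (assuming, hypothetically, the rate parameterization of the exponential and some independent prior $\pi(\mu,\nu)$ on the hyperparameters) is to write the marginal posterior for $\beta$ as a triple integral over the nuisance quantities and evaluate it numerically or by MCMC:

$p(\beta \mid \{x_i\}) \;\propto\; \pi(\beta) \int_0^1 \! dc \int \! d\mu\, d\nu \;\, \text{Beta}(c \mid \mu,\nu)\, \pi(\mu,\nu) \prod_{i=1}^{1000} \left[ c\,\beta e^{-\beta x_i} + (1-c)\,\frac{2/\pi}{1+x_i^2} \right].$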