# Segment 21: Marginalize or Condition Uninteresting Fitted Parameters - 3/27/2013

### Problem 1

**1. Consider a 2-dimensional multivariate normal distribution of the random variable <math>(b_1,b_2)</math> with 2-vector mean <math>(\mu_1,\mu_2)</math> and 2x2 matrix covariance <math>\Sigma</math>. What is the distribution of <math>b_1</math> given that <math>b_2</math> has the particular value <math>b_c</math>? In particular, what is the mean and standard deviation of the conditional distribution of <math>b_1</math>? (Hint, either see Wikipedia "Multivariate normal distribution" for the general case, or else just work out this special case.)**

Conditioning corresponds to slicing the joint distribution along <math>b_2 = b_c</math>. The conditional distribution of <math>b_1</math> is again normal, with

Mean = <math> \mu_1 + \Sigma_{12} \Sigma_{22}^{-1} (b_c - \mu_2) </math>

Variance = <math> \Sigma_{11} - \Sigma_{12} \Sigma_{22}^{-1} \Sigma_{21} </math>

Since the <math>\Sigma_{ij}</math> are scalars here (and <math>\Sigma_{21} = \Sigma_{12}</math>), the standard deviation is <math> \sqrt{\Sigma_{11} - \Sigma_{12}^2 / \Sigma_{22}} </math>.
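A quick numerical check of the conditioning formulas, sketched in NumPy with made-up values for the mean, covariance, and <math>b_c</math> (nothing here comes from the problem itself):

```python
import numpy as np

# Hypothetical 2-D Gaussian: mean (mu1, mu2) and covariance Sigma (values made up)
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])
b_c = 2.5  # condition on b2 = b_c

# Conditional mean and variance of b1 | b2 = b_c (scalar case of the formulas above)
cond_mean = mu[0] + Sigma[0, 1] / Sigma[1, 1] * (b_c - mu[1])
cond_var = Sigma[0, 0] - Sigma[0, 1] ** 2 / Sigma[1, 1]
print(cond_mean, np.sqrt(cond_var))

# Monte Carlo check: sample the joint, keep draws whose b2 lands in a thin
# slab around b_c, and compare the slab's mean and standard deviation
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mu, Sigma, size=1_000_000)
slab = samples[np.abs(samples[:, 1] - b_c) < 0.01, 0]
print(slab.mean(), slab.std())
```

The slab statistics should agree with the analytic conditional mean and standard deviation to within Monte Carlo noise.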

### Problem 2

**2. Same, but marginalize over <math>b_2</math> instead of conditioning on it.**

Marginalizing over <math>b_2</math> just integrates it out of the joint distribution, so the marginal of <math>b_1</math> is normal with

Mean = <math> \mu_1 </math>

Variance = <math> \Sigma_{11} </math>
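The contrast with conditioning is easy to see by sampling. A minimal sketch, reusing the same hypothetical mean and covariance as in the Problem 1 example: discarding the <math>b_2</math> column is marginalization, and the surviving column has mean <math>\mu_1</math> and variance <math>\Sigma_{11}</math>.

```python
import numpy as np

# Same hypothetical 2-D Gaussian as before (values made up)
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

rng = np.random.default_rng(1)
# Marginalizing = simply ignoring the b2 column of the joint samples
b1 = rng.multivariate_normal(mu, Sigma, size=1_000_000)[:, 0]
print(b1.mean(), b1.var())  # approaches mu1 = 1.0 and Sigma11 = 2.0
```

Note the marginal variance <math>\Sigma_{11}</math> is larger than the conditional variance <math>\Sigma_{11} - \Sigma_{12}^2/\Sigma_{22}</math>: knowing <math>b_2</math> can only shrink our uncertainty about <math>b_1</math>.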

### To Think About 1

**Why should it be called the Fisher Information Matrix? What does it have to do with "information"?**

The inverse of the Fisher Information Matrix is (asymptotically) the covariance matrix of the fitted parameters, and by the Cramér-Rao bound no unbiased estimator can achieve smaller variances. It deserves the name "information" because it measures how sharply the likelihood peaks around the best-fit parameters: the more information the data carry about a parameter, the larger the corresponding component of the matrix and the smaller the achievable variance. Off-diagonal components of the inverse likewise tell us whether pairs of fitted parameters are correlated or (if zero) independent.
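As a concrete illustration, for a least-squares fit with independent Gaussian errors the Fisher matrix is <math>F = A^T A</math>, where <math>A_{ij} = \sigma_i^{-1}\,\partial y(x_i)/\partial b_j</math>, and its inverse is the covariance of the fitted parameters. A minimal sketch for a hypothetical straight-line fit <math>y = b_0 + b_1 x</math> (the data grid and error bars are made up):

```python
import numpy as np

# Hypothetical measurement design: 21 x-values with equal error bars sigma
x = np.linspace(0.0, 10.0, 21)
sigma = np.full_like(x, 0.5)

# Design matrix of derivatives dy/db_j = (1, x), weighted by 1/sigma_i
A = np.column_stack([np.ones_like(x), x]) / sigma[:, None]

# Fisher information matrix and the resulting parameter covariance
F = A.T @ A
Cov = np.linalg.inv(F)
print(np.sqrt(np.diag(Cov)))  # standard errors of b0 and b1
```

More data points or smaller error bars increase the entries of <math>F</math> (more information) and shrink the parameter uncertainties, which is exactly the sense in which the matrix quantifies information.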