Eleisha's Segment 19: The Chi-Square Statistic

To Calculate

1. Prove the assertion on lecture slide 5, namely that, for a multivariate normal distribution, the quantity $\displaystyle ({\mathbf x-\mathbf\mu})^T{\mathbf\Sigma}^{-1}({\mathbf x-\mathbf\mu}),$ where $\displaystyle \mathbf x$ is a random draw from the multivariate normal, is $\displaystyle \chi^2$ distributed.

Must prove that $\displaystyle ({\mathbf x-\mathbf\mu})^T{\mathbf\Sigma}^{-1}({\mathbf x-\mathbf\mu})$ is $\displaystyle \chi^2$ distributed.

Since $\displaystyle \mathbf x$ is a random draw from the multivariate normal, $\displaystyle \mathbf x$ can be written as $\displaystyle \mathbf {x = Ly + \mu}$ , where $\displaystyle \mathbf y$ is a vector of independent $\displaystyle y_i \text{'s}$ drawn from a normal distribution with $\displaystyle \mu = 0$ and $\displaystyle \sigma = 1$ , and $\displaystyle \mathbf L$ is the Cholesky factor of the covariance, so that $\displaystyle \mathbf{\Sigma = LL^T}$ . (Shown in lecture 17)

Rearranging $\displaystyle \mathbf {x = Ly + \mu}$ gives $\displaystyle \mathbf {Ly = x - \mu}$

Therefore

$\displaystyle ({\mathbf x-\mathbf\mu})^T{\mathbf\Sigma}^{-1}({\mathbf x-\mathbf\mu}) = \mathbf{(Ly)^T \Sigma^{-1} (Ly) = (y^TL^T)(LL^T)^{-1}(Ly) = y^T L^T (L^T)^{-1} L^{-1} L y = y^Ty} = \sum_i y_i^2$

Since the $\displaystyle y_i\text{'s}$ are drawn from a normal distribution with $\displaystyle \mu = 0$ and $\displaystyle \sigma = 1$ , $\displaystyle \sum_i y_i^2$ is a sum of squared standard normal deviates, which by definition is $\displaystyle \chi^2$ distributed with $\displaystyle n$ degrees of freedom, where $\displaystyle n$ is the dimension of $\displaystyle \mathbf x$ .

This means that $\displaystyle ({\mathbf x-\mathbf\mu})^T{\mathbf\Sigma}^{-1}({\mathbf x-\mathbf\mu})$ is $\displaystyle \chi^2$ distributed.
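The derivation above can be checked numerically. The sketch below (variable names and the specific covariance are illustrative, not from the lecture) draws samples via the Cholesky construction $\displaystyle \mathbf {x = Ly + \mu}$ and verifies that the quadratic form has the mean $\displaystyle n$ and variance $\displaystyle 2n$ of a $\displaystyle \chi^2_n$ distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3                                   # dimension of the multivariate normal
mu = np.array([1.0, -2.0, 0.5])
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)         # an arbitrary symmetric positive-definite covariance
L = np.linalg.cholesky(Sigma)           # Sigma = L L^T

# Draw x = L y + mu with independent y_i ~ N(0, 1), as in the derivation above
N = 100_000
y = rng.standard_normal((n, N))
x = L @ y + mu[:, None]

# Quadratic form (x - mu)^T Sigma^{-1} (x - mu) for each sample
d = x - mu[:, None]
q = (d * np.linalg.solve(Sigma, d)).sum(axis=0)

# If q is chi^2 distributed with n degrees of freedom, its mean is n and its variance is 2n
print(q.mean(), q.var())
```

With 100,000 samples the sample mean and variance should land close to 3 and 6 respectively, consistent with $\displaystyle \chi^2_3$ .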

2. Suppose you measure a bunch of quantities $\displaystyle x_i$ , each of which is measured with a measurement accuracy $\displaystyle \sigma_i$ and has a theoretically expected value $\displaystyle \mu_i$ . Describe in detail how you might use a chi-square test statistic as a p-value test to see if your theory is viable. Should your test be one- or two-tailed?
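One possible approach, sketched in code (the function name and example numbers are illustrative): form $\displaystyle \chi^2 = \sum_i (x_i - \mu_i)^2/\sigma_i^2$ , which is $\displaystyle \chi^2$ distributed with $\displaystyle n$ degrees of freedom if the theory is correct, and report the upper-tail p-value, rejecting the theory if p falls below a chosen threshold (e.g. 0.05). The test is usually one-tailed, since only an improbably large $\displaystyle \chi^2$ argues against the theory; a suspiciously small $\displaystyle \chi^2$ more often signals overestimated $\displaystyle \sigma_i \text{'s}$ than a correct theory.

```python
import numpy as np
from scipy import stats

def chi_square_p_value(x, mu, sigma):
    """Upper-tail (one-tailed) chi-square p-value test of a theory.

    x: measured values, mu: theoretically expected values,
    sigma: measurement accuracies (illustrative names).
    """
    x, mu, sigma = map(np.asarray, (x, mu, sigma))
    chi2 = np.sum(((x - mu) / sigma) ** 2)
    dof = x.size                      # degrees of freedom; no fitted parameters subtracted here
    p = stats.chi2.sf(chi2, df=dof)   # survival function = P(chi^2_dof >= observed value)
    return chi2, p

# Measurements each about one sigma from theory: chi^2 close to dof, moderate p
chi2_val, p = chi_square_p_value([1.1, 1.9, 3.2], [1.0, 2.0, 3.0], [0.1, 0.1, 0.2])
print(chi2_val, p)
```

If some of the $\displaystyle \mu_i$ were obtained by fitting parameters to the same data, the degrees of freedom would be reduced by the number of fitted parameters.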