# Eleisha's Segment 19: The Chi-Square Statistic

To Calculate

1. Prove the assertion on lecture slide 5, namely that, for a multivariate normal distribution, the quantity ${\displaystyle ({\mathbf {x} -\mathbf {\mu } })^{T}{\mathbf {\Sigma } }^{-1}({\mathbf {x} -\mathbf {\mu } })}$, where ${\displaystyle \mathbf {x} }$ is a random draw from the multivariate normal, is ${\displaystyle \chi ^{2}}$ distributed.

We must show that ${\displaystyle ({\mathbf {x} -\mathbf {\mu } })^{T}{\mathbf {\Sigma } }^{-1}({\mathbf {x} -\mathbf {\mu } })}$ is ${\displaystyle \chi ^{2}}$ distributed.

Since ${\displaystyle \mathbf {x} }$ is a random draw from the multivariate normal, it can be written as ${\displaystyle \mathbf {x=Ly+\mu } }$, where ${\displaystyle \mathbf {L} }$ is the Cholesky factor of the covariance matrix (so that ${\displaystyle \mathbf {\Sigma =LL^{T}} }$) and ${\displaystyle \mathbf {y} }$ is filled with ${\displaystyle y_{i}}$'s drawn independently from a normal distribution with mean 0 and standard deviation 1. (Shown in lecture 17.)
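This construction can be sketched numerically (the mean vector and covariance matrix below are made-up example values, and any symmetric positive-definite ${\displaystyle \mathbf {\Sigma } }$ would do): draw independent standard normals, then map them through the Cholesky factor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up 3-dimensional example: mean vector and a positive-definite covariance.
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.3, 0.1],
                  [0.3, 1.0, 0.2],
                  [0.1, 0.2, 1.5]])

L = np.linalg.cholesky(Sigma)   # lower-triangular L with Sigma = L L^T
y = rng.standard_normal(3)      # independent N(0, 1) components
x = L @ y + mu                  # x is a multivariate-normal draw: x = L y + mu

# Sanity check: the Cholesky factor reproduces Sigma exactly.
assert np.allclose(L @ L.T, Sigma)
```

Since `y` has identity covariance, `L @ y` has covariance `L @ L.T`, which is exactly `Sigma`.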

Rearranging ${\displaystyle \mathbf {x=Ly+\mu } }$ gives ${\displaystyle \mathbf {Ly=x-\mu } }$.

Therefore

${\displaystyle ({\mathbf {x} -\mathbf {\mu } })^{T}{\mathbf {\Sigma } }^{-1}({\mathbf {x} -\mathbf {\mu } })=\mathbf {(Ly)^{T}\Sigma ^{-1}(Ly)=(y^{T}L^{T})\Sigma ^{-1}(Ly)=(y^{T}L^{T})(LL^{T})^{-1}(Ly)=(y^{T}L^{T})(L^{T})^{-1}L^{-1}(Ly)=y^{T}y} =\sum _{i}y_{i}^{2}}$

Since the ${\displaystyle y_{i}}$'s are drawn from normal distributions with mean 0 and standard deviation 1, ${\displaystyle \sum _{i}y_{i}^{2}}$ is a sum of squared t-values (squared standard normal deviates), which by definition is ${\displaystyle \chi ^{2}}$ distributed with degrees of freedom equal to the dimension of ${\displaystyle \mathbf {y} }$.

This means that ${\displaystyle ({\mathbf {x} -\mathbf {\mu } })^{T}{\mathbf {\Sigma } }^{-1}({\mathbf {x} -\mathbf {\mu } })}$ is ${\displaystyle \chi ^{2}}$ distributed.
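The whole derivation can be checked by simulation (a sketch using the same kind of made-up ${\displaystyle \mathbf {\Sigma } }$ as above): over many draws of ${\displaystyle \mathbf {x} }$, the quadratic form should have the mean ${\displaystyle \nu }$ and variance ${\displaystyle 2\nu }$ of a ${\displaystyle \chi ^{2}}$ distribution with ${\displaystyle \nu }$ degrees of freedom.

```python
import numpy as np

rng = np.random.default_rng(1)
nu = 3                                   # dimension = degrees of freedom
mu = np.array([1.0, -2.0, 0.5])          # made-up mean and covariance
Sigma = np.array([[2.0, 0.3, 0.1],
                  [0.3, 1.0, 0.2],
                  [0.1, 0.2, 1.5]])
L = np.linalg.cholesky(Sigma)            # Sigma = L L^T
Sigma_inv = np.linalg.inv(Sigma)

n = 100_000
y = rng.standard_normal((n, nu))         # rows of independent N(0, 1) deviates
x = y @ L.T + mu                         # n multivariate-normal draws, x = L y + mu
d = x - mu
q = np.einsum('ij,jk,ik->i', d, Sigma_inv, d)   # (x-mu)^T Sigma^{-1} (x-mu), per row

# A chi-square variate with nu degrees of freedom has mean nu and variance 2*nu.
print(q.mean(), q.var())                 # ≈ 3 and ≈ 6
```

The `einsum` line evaluates the quadratic form for all `n` draws at once rather than looping.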

2. Suppose you measure a bunch of quantities ${\displaystyle x_{i}}$, each of which is measured with a measurement accuracy ${\displaystyle \sigma _{i}}$ and has a theoretically expected value ${\displaystyle \mu _{i}}$. Describe in detail how you might use a chi-square test statistic as a p-value test to see whether your theory is viable. Should your test be 1- or 2-tailed?
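One possible sketch of the test the question describes (all measurements and theory values below are made-up example data): form ${\displaystyle \chi ^{2}=\sum _{i}(x_{i}-\mu _{i})^{2}/\sigma _{i}^{2}}$ and compute the tail probability ${\displaystyle P(\chi _{\nu }^{2}\geq \chi _{\rm {obs}}^{2})}$ as the p-value; here the tail probability is estimated by Monte Carlo simply to avoid a scipy dependency.

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up example data: theoretical predictions, measurement accuracies, measurements.
mu_theory = np.array([10.0, 20.0, 30.0, 40.0])
sigma = np.array([1.0, 2.0, 1.5, 3.0])
x_obs = np.array([10.8, 17.5, 31.0, 44.0])

# Chi-square statistic: sum of squared standardized residuals.
chi2_obs = np.sum(((x_obs - mu_theory) / sigma) ** 2)
nu = len(x_obs)                      # degrees of freedom (no fitted parameters here)

# One-tailed (large-chi-square side) p-value, estimated by Monte Carlo:
# the probability that a chi-square(nu) variate exceeds the observed value.
sims = rng.chisquare(nu, size=200_000)
p_value = np.mean(sims >= chi2_obs)
print(chi2_obs, p_value)             # p_value ≈ 0.35 here, so no evidence against the theory
```

A small p-value (say below 0.05) would mean the data are improbably far from the theory's predictions, disfavoring the theory.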