Eleisha's Segment 19: The Chi-Square Statistic


To Calculate

1. Prove the assertion on lecture slide 5, namely that, for a multivariate normal distribution, the quantity <math> ({\mathbf x-\mathbf\mu})^T{\mathbf\Sigma}^{-1}({\mathbf x-\mathbf\mu}) </math>, where <math> \mathbf x </math> is a random draw from the multivariate normal, is <math> \chi^2 </math> distributed.

Must prove that <math> ({\mathbf x-\mathbf\mu})^T{\mathbf\Sigma}^{-1}({\mathbf x-\mathbf\mu}) </math> is <math> \chi^2 </math> distributed.

Since <math> \mathbf x </math> is a random draw from the multivariate normal, <math> \mathbf x </math> can be written as <math> \mathbf {x  = Ly + \mu} </math>, where <math> \mathbf L </math> is the Cholesky factor of the covariance matrix (so <math> \mathbf {\Sigma = LL^T} </math>) and <math> \mathbf y </math> is filled with <math> y_i </math>'s drawn from a normal distribution with mean zero and standard deviation one. (Shown in lecture 17)
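As a concrete illustration of this construction (a minimal sketch, not part of the original segment; the mean vector and covariance matrix below are made-up values), NumPy's Cholesky routine gives an <math> \mathbf L </math> with <math> \mathbf {\Sigma = LL^T} </math>, and one draw is then <math> \mathbf {x  = Ly + \mu} </math>:

<syntaxhighlight lang="python">
import numpy as np

# Made-up 3-dimensional example: mean vector and covariance matrix
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.3, 0.1],
                  [0.3, 1.5, 0.2],
                  [0.1, 0.2, 1.0]])

L = np.linalg.cholesky(Sigma)      # lower-triangular L with Sigma = L @ L.T
y = np.random.standard_normal(3)   # independent N(0, 1) draws
x = L @ y + mu                     # one draw from N(mu, Sigma)
</syntaxhighlight>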

If you manipulate <math> \mathbf {x  = Ly + \mu} </math> you get <math> \mathbf {Ly = x - \mu} </math>.

Therefore <math> ({\mathbf x-\mathbf\mu})^T{\mathbf\Sigma}^{-1}({\mathbf x-\mathbf\mu}) = ({\mathbf{Ly}})^T({\mathbf{LL}}^T)^{-1}({\mathbf{Ly}}) = ({\mathbf y}^T{\mathbf L}^T)({\mathbf L}^T)^{-1}{\mathbf L}^{-1}({\mathbf{Ly}}) = {\mathbf y}^T{\mathbf y} = \sum_i y_i^2 </math>
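Continuing with the made-up example above (a quick numerical check, not part of the original derivation), this identity can be verified for a single draw:

<syntaxhighlight lang="python">
# Uses the x, y, Sigma, mu defined in the sketch above (hypothetical example values)
quad_form = (x - mu) @ np.linalg.inv(Sigma) @ (x - mu)
print(np.isclose(quad_form, y @ y))   # True: the quadratic form equals the sum of y_i^2
</syntaxhighlight>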

Since the <math> y_i </math>'s are drawn from normal distributions with mean zero and standard deviation one, <math> \sum_i y_i^2 </math> is a sum of squared t-values.

This means that <math> ({\mathbf x-\mathbf\mu})^T{\mathbf\Sigma}^{-1}({\mathbf x-\mathbf\mu}) </math> is <math> \chi^2 </math> distributed, with the number of degrees of freedom equal to the dimension of <math> \mathbf x </math>.
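One way to convince yourself of this numerically (a rough sketch using the same made-up mean and covariance as above, not part of the original write-up) is to draw many samples, form the quadratic form for each, and compare against a chi-square distribution with degrees of freedom equal to the dimension:

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Made-up mean and covariance (same values as the sketch above)
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.3, 0.1],
                  [0.3, 1.5, 0.2],
                  [0.1, 0.2, 1.0]])
d = len(mu)

# Draw many samples and form (x - mu)^T Sigma^{-1} (x - mu) for each row
X = rng.multivariate_normal(mu, Sigma, size=100000)
diff = X - mu
q = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma), diff)

print(q.mean())                            # should be close to d (= 3 here)
print(stats.kstest(q, 'chi2', args=(d,)))  # K-S comparison against chi-square with d dof
</syntaxhighlight>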

To Think About

1. Why are we so interested in t-values? Why do we square them?

2. Suppose you measure a bunch of quantities <math> x_i </math>, each of which is measured with a measurement accuracy <math> \sigma_i </math> and has a theoretically expected value <math> \mu_i </math>. Describe in detail how you might use a chi-square test statistic as a p-value test to see if your theory is viable. Should your test be 1 or 2 tailed?
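A minimal sketch of how such a test might look in practice (the data values below are made up, and this is one possible setup rather than the segment's official answer): form the squared t-values <math> (x_i - \mu_i)^2 / \sigma_i^2 </math>, sum them into a chi-square statistic, and take the upper tail probability as the p-value.

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

# Made-up measured values, measurement accuracies, and theoretical expectations
x = np.array([4.9, 10.2, 1.1, 7.8])          # measured quantities x_i
sigma = np.array([0.2, 0.5, 0.1, 0.3])       # measurement accuracies sigma_i
mu_theory = np.array([5.0, 10.0, 1.0, 8.0])  # theoretically expected values mu_i

# Chi-square statistic: sum of squared t-values
chi2_stat = np.sum(((x - mu_theory) / sigma) ** 2)
dof = len(x)   # no fitted parameters here, so dof = number of measurements

# Upper-tail p-value: probability of a chi-square this large or larger by chance
p_value = stats.chi2.sf(chi2_stat, dof)
print(chi2_stat, p_value)
</syntaxhighlight>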

Back To: Eleisha Jackson