(DT) Segment 17: The Multivariate Normal Distribution

From Computational Statistics Course Wiki

To Calculate

Question 1: For the transformation <math>\mathbf{y} = L^T(\mathbf{x} - \boldsymbol\mu)</math>, where <math>\Sigma^{-1} = LL^T</math> is the Cholesky decomposition of the inverse of the covariance matrix, the Jacobian matrix is the constant matrix <math>L^T</math>, so the Jacobian determinant for the variable transformation is,

<math>\left|\frac{\partial \mathbf{y}}{\partial \mathbf{x}}\right| = \det\left(L^T\right) = \det\left(L\right) = \left(\det\Sigma\right)^{-1/2}.</math>
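As a quick numerical check of this determinant identity, here is a short sketch; the covariance matrix below is invented purely for illustration:

```python
import numpy as np

# A made-up symmetric positive-definite covariance matrix, for illustration.
Sigma = np.array([[4.0, 1.2, 0.8],
                  [1.2, 3.0, 0.5],
                  [0.8, 0.5, 2.0]])

# Cholesky factor of the inverse covariance: Sigma^{-1} = L L^T.
L = np.linalg.cholesky(np.linalg.inv(Sigma))

# y = L^T (x - mu) is linear, so its Jacobian matrix is the constant
# matrix L^T, and the Jacobian determinant is det(L^T) = det(L).
jac_det = np.linalg.det(L.T)

print(jac_det, np.linalg.det(Sigma) ** -0.5)  # the two values agree
```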
Question 2:

  • For the first part, consider the Cholesky decomposition of the inverse of the covariance matrix, <math>\Sigma^{-1} = LL^T</math>, where <math>L</math> is lower triangular (so <math>L^T</math> is upper triangular).

Then, using the transformation <math>\mathbf{y} = L^T(\mathbf{x} - \boldsymbol\mu)</math>, we can decompose the multivariate normal distribution into the product of 3 standard normal distributions; in particular, the mean of <math>\mathbf{y}</math> will be a zero vector. Indeed, from the transformation, each component of <math>\mathbf{y}</math> is a linear combination of the components of <math>\mathbf{x} - \boldsymbol\mu</math>, and therefore, by linearity of expectations, <math>\mathbf{y}</math> has the zero vector as its mean. First, writing the first two components of <math>\mathbf{y} = L^T(\mathbf{x} - \boldsymbol\mu)</math> in block form, we obtain the following relationship between the remaining two dimensions of <math>\mathbf{y}</math> and <math>\mathbf{x}</math>,

<math>\tilde{\mathbf{y}} = A\,(\tilde{\mathbf{x}} - \tilde{\boldsymbol\mu}) + \mathbf{b}\,(x_3 - \mu_3),</math>

where <math>A</math> is the top-left 2x2 submatrix of <math>L^T</math>, <math>\mathbf{b}</math> is the top-right 2x1 submatrix of <math>L^T</math>, and a tilde denotes the first two components of a vector. Now, note that from the above transformation law, and the expression for <math>\Sigma^{-1}</math> in terms of <math>L</math>, it is evident that the inverse transformation would yield the following as our new 2D covariance matrix,

<math>\tilde\Sigma' = A^{-1}\left(A^{-1}\right)^T = \left(A^T A\right)^{-1}.</math>

Then, writing the new mean vector <math>\tilde{\boldsymbol\mu}'</math> so that <math>\tilde{\mathbf{y}} = A\,(\tilde{\mathbf{x}} - \tilde{\boldsymbol\mu}')</math>, we have the following,

<math>A\,(\tilde{\mathbf{x}} - \tilde{\boldsymbol\mu}') = A\,(\tilde{\mathbf{x}} - \tilde{\boldsymbol\mu}) + \mathbf{b}\,(x_3 - \mu_3).</math>

Then, solving for <math>\tilde{\boldsymbol\mu}'</math>, we obtain the new mean vector as,

<math>\tilde{\boldsymbol\mu}' = \tilde{\boldsymbol\mu} - A^{-1}\mathbf{b}\,(x_3 - \mu_3).</math>
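The block algebra above can be sanity-checked numerically. Below is a sketch, assuming the conventions of this part (<math>\Sigma^{-1} = LL^T</math> with <math>L</math> lower triangular, <math>A</math> and <math>\mathbf{b}</math> the top blocks of <math>L^T</math>); the numbers for the mean, covariance, and the held-fixed value of <math>x_3</math> are invented. The derived mean and covariance are compared against the standard formulas for a Gaussian with <math>x_3</math> held fixed, <math>\tilde{\boldsymbol\mu} + \Sigma_{12}(x_3 - \mu_3)/\Sigma_{33}</math> and <math>\Sigma_{11} - \Sigma_{12}\Sigma_{21}/\Sigma_{33}</math>:

```python
import numpy as np

# Made-up 3D example: mean, covariance, and a fixed value for x3.
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[4.0, 1.2, 0.8],
                  [1.2, 3.0, 0.5],
                  [0.8, 0.5, 2.0]])
x3 = 1.7

# Cholesky factor of the inverse covariance, Sigma^{-1} = L L^T,
# and the blocks of L^T used in the derivation.
Lt = np.linalg.cholesky(np.linalg.inv(Sigma)).T
A, b = Lt[:2, :2], Lt[:2, 2]

# New mean and covariance from the derivation above.
mean_new = mu[:2] - np.linalg.solve(A, b) * (x3 - mu[2])
cov_new = np.linalg.inv(A.T @ A)

# Standard fixed-x3 (conditioning) formulas for comparison.
mean_ref = mu[:2] + Sigma[:2, 2] / Sigma[2, 2] * (x3 - mu[2])
cov_ref = Sigma[:2, :2] - np.outer(Sigma[:2, 2], Sigma[2, :2]) / Sigma[2, 2]

print(np.allclose(mean_new, mean_ref), np.allclose(cov_new, cov_ref))
```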
  • For part two, let's try an alternate approach, where we try to separate the lost dimensions from the remaining dimensions by completing the square in the exponent of <math>p(\mathbf{x})</math>. I use the following notation for conciseness, but it's not required if you are not into the whole brevity thing:

<math>\Sigma^{-1} = \begin{pmatrix} P & \mathbf{q} \\ \mathbf{q}^T & r \end{pmatrix},</math>

where <math>P</math> is the top-left 2x2 submatrix of <math>\Sigma^{-1}</math>, <math>\mathbf{q}</math> is a 2x1 vector, <math>r</math> is a scalar, and a tilde denotes the first two components of a vector.
Then, the square in the exponent is simply,

<math>(\mathbf{x} - \boldsymbol\mu)^T \Sigma^{-1} (\mathbf{x} - \boldsymbol\mu) = (\tilde{\mathbf{x}} - \tilde{\boldsymbol\mu})^T \left(P - \frac{\mathbf{q}\mathbf{q}^T}{r}\right) (\tilde{\mathbf{x}} - \tilde{\boldsymbol\mu}) + r\left(x_3 - \mu_3 + \frac{\mathbf{q}^T(\tilde{\mathbf{x}} - \tilde{\boldsymbol\mu})}{r}\right)^2.</math>

It is clear from the above decomposition of the exponent that after marginalizing over <math>x_3</math> (the second term is a complete square in <math>x_3</math>, so it integrates to a constant), only the first term in the above expression is left, and this means that, if the new mean and covariance are <math>\tilde{\boldsymbol\mu}'</math> and <math>\tilde\Sigma'</math>,

<math>(\tilde{\mathbf{x}} - \tilde{\boldsymbol\mu}')^T \tilde\Sigma'^{-1} (\tilde{\mathbf{x}} - \tilde{\boldsymbol\mu}') = (\tilde{\mathbf{x}} - \tilde{\boldsymbol\mu})^T \left(P - \frac{\mathbf{q}\mathbf{q}^T}{r}\right)(\tilde{\mathbf{x}} - \tilde{\boldsymbol\mu}),</math>

where <math>\tilde{\mathbf{x}}</math> now denotes only the first two dimensions, i.e. the dimensions that remain. It is then evident that,

<math>\tilde{\boldsymbol\mu}' = \tilde{\boldsymbol\mu}, \qquad \tilde\Sigma' = \left(P - \frac{\mathbf{q}\mathbf{q}^T}{r}\right)^{-1}.</math>
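The Schur-complement result can also be checked numerically. A short sketch, with the covariance matrix invented for illustration: inverting <math>P - \mathbf{q}\mathbf{q}^T/r</math> should reproduce the top-left 2x2 block of <math>\Sigma</math>, the known marginal covariance of a multivariate normal.

```python
import numpy as np

# Made-up 3x3 covariance matrix for illustration.
Sigma = np.array([[4.0, 1.2, 0.8],
                  [1.2, 3.0, 0.5],
                  [0.8, 0.5, 2.0]])

# Blocks of the inverse covariance: P (2x2), q (2x1), r (scalar).
Prec = np.linalg.inv(Sigma)
P, q, r = Prec[:2, :2], Prec[:2, 2], Prec[2, 2]

# Inverting the Schur complement P - q q^T / r ...
cov_marg = np.linalg.inv(P - np.outer(q, q) / r)

# ... recovers the top-left 2x2 block of Sigma, i.e. the marginal
# covariance of (x1, x2).
print(np.allclose(cov_marg, Sigma[:2, :2]))  # True
```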