Exercise 2.29 (Gaussian distribution)
A random vector $X = (X_1, \dots, X_n)$ taking values in $\mathbb{R}^n$ is said to be a gaussian random vector if there exists $\mu \in \mathbb{R}^n$ and an $n \times n$ positive definite real symmetric matrix $\Sigma$ such that
$$\mathbb{P}(X \in S) = \frac{1}{(2\pi)^{n/2} (\det \Sigma)^{1/2}} \int_S e^{-\frac{1}{2} (x-\mu)^T \Sigma^{-1} (x-\mu)}\, dx$$
for all Borel sets $S \subset \mathbb{R}^n$ (where we identify elements of $\mathbb{R}^n$ with column vectors). The distribution of $X$ is called a multivariate normal distribution.
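As a numerical sanity check, the density appearing in this definition can be transcribed directly and integrated over a large box; the particular $\mu$ and $\Sigma$ below are illustrative choices, not taken from the exercise.

```python
import numpy as np

# Illustrative parameters (not from the exercise text).
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])   # symmetric positive definite

def gaussian_density(x):
    """(2*pi)^{-n/2} (det Sigma)^{-1/2} exp(-(x-mu)^T Sigma^{-1} (x-mu) / 2)."""
    n = len(mu)
    d = x - mu
    norm = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(Sigma))
    return np.exp(-0.5 * d @ np.linalg.solve(Sigma, d)) / norm

# Riemann sum over a box wide enough to capture essentially all the mass;
# a probability density should integrate to (approximately) 1.
xs = np.linspace(-8.0, 10.0, 200)
ys = np.linspace(-11.0, 7.0, 200)
dx, dy = xs[1] - xs[0], ys[1] - ys[0]
total = sum(gaussian_density(np.array([x, y])) for x in xs for y in ys) * dx * dy
print(round(total, 2))  # close to 1
```

This only checks normalization; the exercise itself asks for the exact relationship between $(\mu, \Sigma)$ and the mean and covariances.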
1. If $X$ is a gaussian random vector with the indicated parameters $\mu, \Sigma$, show that $\mathbb{E} X_i = \mu_i$ and $\mathrm{Cov}(X_i, X_j) = \Sigma_{ij}$ for $1 \le i, j \le n$. In particular $\mathrm{Var}(X_i) = \Sigma_{ii}$. Thus we see that the parameters of a gaussian random vector can be recovered from the mean and covariances.
2. If $X = (X_1, \dots, X_n)$ is a gaussian random vector and $1 \le i < j \le n$, show that $X_i$ and $X_j$ are independent if and only if the covariance $\mathrm{Cov}(X_i, X_j)$ vanishes. Furthermore, show that $X_1, \dots, X_n$ are jointly independent if and only if all the covariances $\mathrm{Cov}(X_i, X_j)$ for $1 \le i < j \le n$ vanish. In particular, for gaussian random vectors, joint independence is equivalent to pairwise independence. (Contrast this with Exercise 23.)
3. Give an example of two real random variables $X, Y$, each of which is gaussian, and for which $\mathrm{Cov}(X, Y) = 0$, but such that $X$ and $Y$ are not independent. Why does this not contradict (ii)?
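One classical construction exhibiting the phenomenon in part 3 (stated here as a standard example, not necessarily the intended solution) takes $X$ standard gaussian and $Y = SX$ with $S$ an independent random sign: $Y$ is then gaussian, $\mathrm{Cov}(X, Y) = \mathbb{E}[S]\,\mathbb{E}[X^2] = 0$, yet $|Y| = |X|$, so the variables are far from independent; the point is that the pair $(X, Y)$ is not *jointly* gaussian. A quick simulation:

```python
import numpy as np

# X standard gaussian, S an independent random sign, Y = S * X.
rng = np.random.default_rng(2)
n = 500_000
X = rng.standard_normal(n)
S = rng.choice([-1.0, 1.0], size=n)
Y = S * X

# Empirical covariance is near zero...
cov = np.mean(X * Y) - X.mean() * Y.mean()
print(round(cov, 2))  # close to 0

# ...but events involving both variables betray the dependence:
# |Y| = |X|, so P(|X|>1 and |Y|>1) = P(|X|>1), far from the product.
p_both = np.mean((np.abs(X) > 1) & (np.abs(Y) > 1))
p_prod = np.mean(np.abs(X) > 1) * np.mean(np.abs(Y) > 1)
print(round(p_both, 2), round(p_prod, 2))
```

This does not contradict part 2 because part 2 assumes the vector $(X_1, \dots, X_n)$ is gaussian as a vector, not merely that each coordinate is gaussian.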