
Exercise 2.29 (Gaussian distribution)

A random vector $(X_1,\dots,X_n)$ taking values in $\mathbf{R}^n$ is said to be a gaussian random vector if there exist $\mu = (\mu_1,\dots,\mu_n) \in \mathbf{R}^n$ and an $n \times n$ positive definite real symmetric matrix $\Sigma := (\sigma_{ij})_{1 \le i,j \le n}$ such that

$$\mathbf{P}\big((X_1,\dots,X_n) \in S\big) = \frac{1}{(2\pi)^{n/2} (\det \Sigma)^{1/2}} \int_S e^{-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)}\, dx$$

for all Borel sets $S \subseteq \mathbf{R}^n$ (where we identify elements of $\mathbf{R}^n$ with column vectors). The distribution of $(X_1,\dots,X_n)$ is called a multivariate normal distribution.
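As a quick sanity check on this density formula (a minimal sketch, not part of the exercise; the particular $\mu$ and $\Sigma$ below are arbitrary illustrative choices), one can compare a direct evaluation of the integrand against `scipy.stats.multivariate_normal`:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Arbitrary illustrative parameters; any symmetric positive definite Sigma works.
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])

def gaussian_density(x, mu, Sigma):
    """(2 pi)^{-n/2} (det Sigma)^{-1/2} exp(-(x - mu)^T Sigma^{-1} (x - mu) / 2)."""
    n = len(mu)
    diff = x - mu
    quad = diff @ np.linalg.solve(Sigma, diff)  # (x - mu)^T Sigma^{-1} (x - mu)
    return np.exp(-0.5 * quad) / np.sqrt((2.0 * np.pi) ** n * np.linalg.det(Sigma))

x = np.array([0.3, -1.1])
print(gaussian_density(x, mu, Sigma))         # direct evaluation of the formula
print(multivariate_normal(mu, Sigma).pdf(x))  # agrees up to floating-point error
```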

1. If $(X_1,\dots,X_n)$ is a gaussian random vector with the indicated parameters $\mu, \Sigma$, show that $\mathbf{E} X_i = \mu_i$ and $\mathrm{Cov}(X_i,X_j) = \sigma_{ij}$ for $1 \le i,j \le n$. In particular $\mathrm{Var}(X_i) = \sigma_{ii}$. Thus we see that the parameters of a gaussian random vector can be recovered from the mean and covariances. (A numerical sanity check is sketched after this list.)
2. If $(X_1,\dots,X_n)$ is a gaussian random vector and $1 \le i,j \le n$, show that $X_i$ and $X_j$ are independent if and only if the covariance $\mathrm{Cov}(X_i,X_j)$ vanishes. Furthermore, show that $X_1,\dots,X_n$ are jointly independent if and only if all the covariances $\mathrm{Cov}(X_i,X_j)$ for $1 \le i < j \le n$ vanish. In particular, for gaussian random vectors, joint independence is equivalent to pairwise independence. (Contrast this with Exercise 23.) The density factorization behind the "if" direction is sketched after this list.
3. Give an example of two real random variables $X, Y$, each of which is gaussian, and for which $\mathrm{Cov}(X,Y) = 0$, but such that $X$ and $Y$ are not independent. Why does this not contradict part (2)? (A standard construction is simulated after this list.)
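For part (1), a quick Monte Carlo illustration (a sketch only, using NumPy's built-in multivariate normal sampler as a stand-in for a gaussian random vector; the parameters below are arbitrary): the sample mean and sample covariance of many draws should approach $\mu$ and $\Sigma$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative parameters (Sigma is symmetric positive definite).
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.3, 0.0],
                  [0.3, 1.0, -0.2],
                  [0.0, -0.2, 0.5]])

# Many independent draws of the gaussian random vector (X_1, ..., X_n).
samples = rng.multivariate_normal(mu, Sigma, size=200_000)

print(samples.mean(axis=0))           # ~ mu:    E X_i = mu_i
print(np.cov(samples, rowvar=False))  # ~ Sigma: Cov(X_i, X_j) = sigma_ij
```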
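For part (2), the heart of the "if" direction of the joint-independence claim is that vanishing covariances make $\Sigma$ diagonal, in which case the density factors. Sketch: if $\Sigma = \mathrm{diag}(\sigma_{11},\dots,\sigma_{nn})$, then $(x-\mu)^T \Sigma^{-1} (x-\mu) = \sum_{i=1}^n (x_i-\mu_i)^2/\sigma_{ii}$ and $\det \Sigma = \prod_{i=1}^n \sigma_{ii}$, so

$$\frac{1}{(2\pi)^{n/2}(\det \Sigma)^{1/2}}\, e^{-\frac{1}{2}(x-\mu)^T \Sigma^{-1}(x-\mu)} = \prod_{i=1}^n \frac{1}{\sqrt{2\pi \sigma_{ii}}}\, e^{-(x_i-\mu_i)^2/2\sigma_{ii}},$$

i.e. the joint density is the product of the marginal densities.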
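For part (3), one standard construction (offered as an illustration; it need not be the example the text has in mind) takes $X \sim N(0,1)$ and $Y = \varepsilon X$, where $\varepsilon$ is a fair random sign independent of $X$. Then $Y$ is standard gaussian by symmetry and $\mathrm{Cov}(X,Y) = \mathbf{E}[\varepsilon]\,\mathbf{E}[X^2] = 0$, yet $|Y| = |X|$ always, so $X$ and $Y$ are not independent. The pair $(X,Y)$ fails to be a gaussian random vector (for instance, $X+Y$ vanishes with probability $1/2$), which is why part (2) is not contradicted. A quick simulation:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

x = rng.standard_normal(n)             # X ~ N(0, 1)
eps = rng.choice([-1.0, 1.0], size=n)  # fair random sign, independent of X
y = eps * x                            # Y ~ N(0, 1) by symmetry

print(np.cov(x, y)[0, 1])  # ~ 0: the covariance vanishes

# But X and Y are not independent: |Y| = |X| always, so conditioning matters.
print(np.mean(np.abs(y) > 1.0))                   # ~ 0.317 = P(|Y| > 1)
print(np.mean(np.abs(y[np.abs(x) > 1.0]) > 1.0))  # exactly 1.0 = P(|Y| > 1 given |X| > 1)
```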