
Exercise 4.2 - Uncorrelated and Gaussian does not imply independent unless jointly Gaussian

Answers

For question (a), the p.d.f. of $Y$ satisfies:

$$p(Y \in [a, a+da]) = 0.5\, p(X \in [a, a+da]) + 0.5\, p(X \in [-a-da, -a]) = p(X \in [a, a+da]),$$

since the density of $X$ is symmetric about zero. So $Y$ also follows the standard normal distribution $\mathcal{N}(0, 1)$.
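As a quick numerical sanity check (not part of the original solution), the following minimal sketch simulates $Y = WX$ with $X \sim \mathcal{N}(0,1)$ and $W$ uniform on $\{-1, +1\}$, and runs a Kolmogorov-Smirnov test against the standard normal c.d.f.; the variable names are illustrative only.

```python
# Minimal Monte Carlo sketch: Y = W * X should be N(0, 1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000
x = rng.standard_normal(n)              # X ~ N(0, 1)
w = rng.choice([-1.0, 1.0], size=n)     # W = +/-1 with probability 0.5 each
y = w * x                               # Y = W X

# Kolmogorov-Smirnov test against the standard normal c.d.f.;
# a large p-value is consistent with Y ~ N(0, 1).
print(stats.kstest(y, "norm"))
```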

For question (b), we have:

$$\operatorname{cov}(X, Y) = \mathbb{E}(XY) - \mathbb{E}(X)\,\mathbb{E}(Y) = \mathbb{E}_W\big(\mathbb{E}(XY \mid W)\big) - 0 = 0.5\,\mathbb{E}(X^2) + 0.5\,\mathbb{E}(-X^2) = 0.$$

So they are uncorrelated.
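A minimal simulation sketch (same assumed set-up as above, not from the book) can confirm that the sample covariance of $X$ and $Y$ is close to zero:

```python
# Estimate cov(X, Y) by simulation; it should vanish up to Monte Carlo noise.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
w = rng.choice([-1.0, 1.0], size=n)
y = w * x

print(np.cov(x, y)[0, 1])   # sample covariance, approximately 0
```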

To show that $X$ and $Y$ are nonetheless not independent, let:

$$a = \Phi^{-1}\!\left(\tfrac{1}{4}\right),$$

where $\Phi$ is the c.d.f. of $X$, i.e.:

$$\int_{-\infty}^{a} \mathcal{N}(x \mid 0, 1)\, dx = \frac{1}{4}.$$

Let $R_1 = (-\infty, a]$ and $R_2 = (a, 0]$. The sample space of the pair $(X, Y)$ is $\mathbb{R}^2$, and $R_1 \times R_2$ is a Borel set in $\mathbb{R}^2$. Were $X$ and $Y$ independent, its probability measure would be $p(X \in R_1)\,p(Y \in R_2) = \frac{1}{4} \cdot \frac{1}{4} = \frac{1}{16}$. However, when $X \in R_1$ we have $|Y| = |X| \ge |a| > 0$, so it is impossible for $Y$ to take a value in $R_2$, and the joint probability is $0$. Hence independence fails.
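The counterexample can also be checked numerically. This sketch (again using the assumed simulation set-up from above) estimates $p(X \in R_1)\,p(Y \in R_2)$ and the joint probability $p(X \in R_1, Y \in R_2)$:

```python
# Check the independence counterexample: the product of marginals is ~1/16,
# but the joint event {X in R1, Y in R2} never occurs because |Y| = |X|.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
w = rng.choice([-1.0, 1.0], size=n)
y = w * x

a = stats.norm.ppf(0.25)             # a = Phi^{-1}(1/4) < 0
in_r1 = x <= a                       # X in R1 = (-inf, a]
in_r2 = (y > a) & (y <= 0)           # Y in R2 = (a, 0]

print(in_r1.mean() * in_r2.mean())   # ~ 1/16 = 0.0625
print((in_r1 & in_r2).mean())        # 0 in every sample
```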

The rule of iterated expectation used above is just marginalization over $W$ together with the product rule:

$$\mathbb{E}[XY] = \int_X \int_Y XY\, p(X, Y)\, dX\, dY = \int_X \int_Y XY \left( \int_W p(X, Y, W)\, dW \right) dX\, dY = \int_W \left( \int_X \int_Y XY\, p(X, Y \mid W)\, dX\, dY \right) p(W)\, dW.$$
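As a final illustration (a sketch under the same assumed simulation set-up, not part of the original answer), one can compare $\mathbb{E}[XY]$ computed directly with the iterated form $\mathbb{E}_W\big(\mathbb{E}(XY \mid W)\big)$:

```python
# Compare E[XY] with 0.5*E[XY | W=1] + 0.5*E[XY | W=-1]; both are ~0.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
w = rng.choice([-1.0, 1.0], size=n)
y = w * x

direct = np.mean(x * y)
iterated = 0.5 * np.mean(x[w == 1] * y[w == 1]) + 0.5 * np.mean(x[w == -1] * y[w == -1])
print(direct, iterated)   # agree up to Monte Carlo noise, both near 0
```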
