
Exercise 2.12 - Expressing mutual information in terms of entropies

Answers

We have:

$$
\begin{aligned}
I(X;Y) &= \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}
        = \sum_{x,y} p(x,y) \log \frac{p(x \mid y)}{p(x)} \\
       &= \sum_{x,y} p(x,y) \log p(x \mid y) \;-\; \sum_x \Big( \sum_y p(x,y) \Big) \log p(x) \\
       &= -H(X \mid Y) + H(X),
\end{aligned}
$$

so that $I(X;Y) = H(X) - H(X \mid Y)$.

Swapping the roles of $X$ and $Y$ yields the symmetric formula $I(X;Y) = H(Y) - H(Y \mid X)$. Substituting the chain rule $H(X \mid Y) = H(X,Y) - H(Y)$ into the first formula, one can proceed to show that $I(X;Y) = H(X) + H(Y) - H(X,Y)$.
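As a sanity check, here is a minimal numerical sketch (not part of the book or the exercise) that verifies the three identities on a small, arbitrarily chosen joint distribution; the table `p_xy` and the helper `H` are illustrative names, and entropies are measured in nats.

```python
import numpy as np

# Arbitrary joint distribution p(x, y): rows index x, columns index y.
p_xy = np.array([[0.20, 0.10, 0.05],
                 [0.05, 0.30, 0.30]])
assert np.isclose(p_xy.sum(), 1.0)

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

def H(p):
    """Entropy in nats of a distribution given as an array of probabilities."""
    p = p[p > 0]  # convention: 0 log 0 = 0
    return -np.sum(p * np.log(p))

H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy.ravel())

# Mutual information straight from the definition.
mask = p_xy > 0
I = np.sum(p_xy[mask] * np.log((p_xy / np.outer(p_x, p_y))[mask]))

# The three identities derived above, with H(X|Y) = H(X,Y) - H(Y).
assert np.isclose(I, H_X - (H_XY - H_Y))  # I = H(X) - H(X|Y)
assert np.isclose(I, H_Y - (H_XY - H_X))  # I = H(Y) - H(Y|X)
assert np.isclose(I, H_X + H_Y - H_XY)    # I = H(X) + H(Y) - H(X,Y)
print(f"I(X;Y) = {I:.6f} nats; all three identities hold.")
```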
