Solution manual: Kevin P. Murphy, Machine Learning: a Probabilistic Perspective

Exercise 3.21 - Mutual information for naive Bayes classifiers with binary features

Answers

By definition:

$$
I(X_j; Y) = \sum_{x_j} \sum_y p(x_j, y) \log \frac{p(x_j, y)}{p(x_j)\, p(y)}.
$$

For binary features, $x_j$ takes the value 0 or 1. Let $\pi_c = p(y = c)$, $\theta_{jc} = p(x_j = 1 \mid y = c)$, and $\theta_j = p(x_j = 1)$ (so that $\theta_j = \sum_c \pi_c \theta_{jc}$). The mutual information between $x_j$ and $Y$ is then:

$$
\begin{aligned}
I_j &= \sum_c p(x_j = 1, c) \log \frac{p(x_j = 1, c)}{p(x_j = 1)\, p(c)}
     + \sum_c p(x_j = 0, c) \log \frac{p(x_j = 0, c)}{p(x_j = 0)\, p(c)} \\
    &= \sum_c \left[ \pi_c \theta_{jc} \log \frac{\theta_{jc}}{\theta_j}
     + \pi_c (1 - \theta_{jc}) \log \frac{1 - \theta_{jc}}{1 - \theta_j} \right],
\end{aligned}
$$

which is exactly Equation (3.76) in the book.
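The final expression can be evaluated directly from the class priors $\pi_c$ and the per-class Bernoulli parameters $\theta_{jc}$. Below is a minimal NumPy sketch (the example numbers are made up for illustration, not taken from the exercise); it computes $I_j$ for every feature at once, and a feature whose $\theta_{jc}$ is identical across classes comes out with zero mutual information, as expected.

```python
import numpy as np

# Hypothetical example parameters (not from the exercise text):
# pi[c]      = p(y = c)            -- class prior
# theta[j,c] = p(x_j = 1 | y = c)  -- per-class Bernoulli parameter
pi = np.array([0.4, 0.6])
theta = np.array([[0.9, 0.2],    # feature 1: informative about y
                  [0.5, 0.5]])   # feature 2: identical across classes

def mutual_information(pi, theta):
    """I_j = sum_c pi_c theta_jc log(theta_jc / theta_j)
           + pi_c (1 - theta_jc) log((1 - theta_jc) / (1 - theta_j)),
    where theta_j = p(x_j = 1) = sum_c pi_c theta_jc."""
    theta_j = theta @ pi  # marginal p(x_j = 1), shape (J,)
    # Contribution of x_j = 1 and x_j = 0 terms, summed over classes c:
    term1 = pi * theta * np.log(theta / theta_j[:, None])
    term0 = pi * (1 - theta) * np.log((1 - theta) / (1 - theta_j[:, None]))
    return (term1 + term0).sum(axis=1)  # one I_j per feature, in nats

I = mutual_information(pi, theta)
print(I)  # I[1] is 0: feature 2 carries no information about y
```

Features can then be ranked by $I_j$ for feature selection, which is how this quantity is used in the book's text.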

2021-03-24 13:42