Exercise 2.13 - Mutual information for correlated normals
Answers
We have:

$$
\begin{aligned}
I(X_1; X_2) &= h(X_1) + h(X_2) - h(X_1, X_2)\\
&= \tfrac{1}{2}\ln(2\pi e\sigma^2) + \tfrac{1}{2}\ln(2\pi e\sigma^2) - \tfrac{1}{2}\ln\!\left[(2\pi e)^2\det\Sigma\right]\\
&= -\tfrac{1}{2}\ln(1-\rho^2),
\end{aligned}
$$

since $\det\Sigma = \sigma^4(1-\rho^2)$ for the covariance matrix $\Sigma = \begin{pmatrix}\sigma^2 & \rho\sigma^2\\ \rho\sigma^2 & \sigma^2\end{pmatrix}$, where the marginal entropies come from (2.139) and the joint entropy from (2.138).
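As a quick sanity check, this result can be verified numerically. The following is a minimal sketch, not part of the original solution, assuming arbitrary illustrative values $\sigma = 1.5$ and $\rho = 0.8$; it uses SciPy's entropy routines, which return differential entropy in nats.

```python
# Sanity check of I(X1;X2) = -1/2 ln(1 - rho^2).
# sigma and rho are arbitrary illustrative values, not from the exercise.
import numpy as np
from scipy.stats import norm, multivariate_normal

sigma, rho = 1.5, 0.8
Sigma = sigma**2 * np.array([[1.0, rho],
                             [rho, 1.0]])

h_x1 = norm(scale=sigma).entropy()  # marginal entropy, as in (2.139)
h_x2 = norm(scale=sigma).entropy()
h_joint = multivariate_normal(mean=np.zeros(2), cov=Sigma).entropy()  # (2.138)

mi_from_entropies = h_x1 + h_x2 - h_joint
mi_closed_form = -0.5 * np.log(1 - rho**2)
print(mi_from_entropies, mi_closed_form)  # both approximately 0.5108
```

Note that the closed form gives $I = 0$ at $\rho = 0$ and $I \to \infty$ as $\rho \to \pm 1$: independent components share no information, while perfectly correlated ones determine each other exactly.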
Here we give a complete derivation of (2.138) and (2.139), which should not be taken for granted. The differential entropy of a 1D Gaussian with density function

$$p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$

is

$$h(X) = -\int p(x)\ln p(x)\,dx = \int p(x)\left[\tfrac{1}{2}\ln(2\pi\sigma^2) + \frac{(x-\mu)^2}{2\sigma^2}\right]dx = \tfrac{1}{2}\ln(2\pi\sigma^2) + \tfrac{1}{2} = \tfrac{1}{2}\ln(2\pi e\sigma^2),$$

where the middle step uses $\int p(x)(x-\mu)^2\,dx = \sigma^2$. This proves (2.139).
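As a small check of this formula (again a sketch of our own, with arbitrary values of $\mu$ and $\sigma$), one can evaluate $-\int p\ln p\,dx$ by numerical quadrature and compare it with the closed form:

```python
# Numerically integrate -p(x) ln p(x) for a 1D Gaussian and compare
# with 1/2 ln(2 pi e sigma^2). mu and sigma are arbitrary.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

mu, sigma = 2.0, 0.7
p = norm(loc=mu, scale=sigma).pdf

# The limits cover essentially all of the probability mass.
h_numeric, _ = quad(lambda x: -p(x) * np.log(p(x)),
                    mu - 12 * sigma, mu + 12 * sigma)
h_closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
print(h_numeric, h_closed_form)  # agree to within quadrature tolerance
```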
For the multi-dimensional case, we begin by diagonalizing the covariance matrix, $\Sigma = U\Lambda U^\top$ with $U$ orthogonal, which decouples the components; since $y = U^\top x$ is a rotation with unit Jacobian, it leaves the differential entropy unchanged. Under this new set of coordinates $y$ (taking $\mu = 0$ without loss of generality, since entropy is translation invariant), the logarithm of the density decomposes into

$$\ln p(y) = \sum_{i=1}^{D}\left[-\tfrac{1}{2}\ln(2\pi\lambda_i) - \frac{y_i^2}{2\lambda_i}\right],$$

where $\lambda_i$ is the $i$-th diagonal component of the transformed covariance matrix $\Lambda$. Integrating along each independent component as in the 1D case gives

$$h(X) = \sum_{i=1}^{D}\tfrac{1}{2}\ln(2\pi e\lambda_i) = \tfrac{1}{2}\ln\!\left[(2\pi e)^D\prod_{i=1}^{D}\lambda_i\right].$$

The product of all diagonal components is exactly $\det\Sigma$, hence proving (2.138).
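The decoupling step can also be checked numerically: for any symmetric positive-definite $\Sigma$, the sum of 1D entropies over the eigen-variances must equal $\tfrac{1}{2}\ln[(2\pi e)^D\det\Sigma]$. A minimal sketch, assuming a randomly generated covariance matrix:

```python
# Check that the sum of 1D entropies of the decoupled components equals
# 1/2 ln[(2 pi e)^D det(Sigma)] for a random SPD covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
Sigma = A @ A.T + 3 * np.eye(3)   # random symmetric positive-definite matrix
lam = np.linalg.eigvalsh(Sigma)   # eigenvalues = variances after rotation

D = Sigma.shape[0]
h_sum = np.sum(0.5 * np.log(2 * np.pi * np.e * lam))
h_joint = 0.5 * np.log((2 * np.pi * np.e) ** D * np.linalg.det(Sigma))
print(h_sum, h_joint)  # identical up to floating-point round-off
```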