
Exercise 21.3 - Variational lower bound for VB for univariate Gaussian

Answers

Recall that the variational lower bound for VB is defined by:

$$\mathcal{L}(q) = -\,\mathbb{KL}\!\left(q \,\|\, \tilde{p}\right),$$

where $\tilde{p}$ is the (unnormalized) posterior and $q$ is the variational distribution. For the univariate Gaussian case, equations (21.84)-(21.87) and (21.92)-(21.97) have already been derived in the book, so we only need to fill in the gap of (21.88)-(21.91); the last three of these equations have already been derived in Chapter 2.
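For orientation (assuming the mean-field factorization $q(\mu, \lambda) = q(\mu)\,q(\lambda)$ used in Section 21.5.1), the bound expands into the expected log joint plus the entropy of $q$, whose individual terms are exactly what (21.84)-(21.97) evaluate:

$$\mathcal{L}(q) = \mathbb{E}_{q}\!\left[\log p(\mathcal{D}, \mu, \lambda)\right] - \mathbb{E}_{q}\!\left[\log q(\mu, \lambda)\right] = \mathbb{E}_{q}\!\left[\log p(\mathcal{D}, \mu, \lambda)\right] + \mathbb{H}\!\left[q(\mu)\right] + \mathbb{H}\!\left[q(\lambda)\right].$$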

Here we derive (21.88). The Gamma distribution is an exponential family distribution:

$$\mathrm{Ga}(x \mid a, b) = \frac{b^{a}}{\Gamma(a)}\, x^{a-1} \exp\{-b x\} \propto \exp\{-b x + (a-1)\ln x\} = \exp\{\boldsymbol{\phi}(x)^{T}\boldsymbol{\theta}\},$$

whose sufficient statistics are $\boldsymbol{\phi}(x) = (x, \ln x)^{T}$ and whose natural parameters are $\boldsymbol{\theta} = (-b, a-1)^{T}$. Its cumulant function is:

$$A(\boldsymbol{\theta}) = \log Z(\boldsymbol{\theta}) = \log \frac{\Gamma(a)}{b^{a}} = \log \Gamma(a) - a \log b.$$

Recall that the expectation of a sufficient statistic is given by the corresponding partial derivative of the cumulant function, therefore:

$$\mathbb{E}[\ln x] = \frac{\partial A}{\partial (a-1)} = \frac{\Gamma'(a)}{\Gamma(a)} - \log b.$$
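To make the differentiation explicit (a small intermediate step: write $A$ in terms of the natural parameters, with $a = \theta_2 + 1$ and $b = -\theta_1$):

$$A(\boldsymbol{\theta}) = \log \Gamma(\theta_2 + 1) - (\theta_2 + 1)\log(-\theta_1), \qquad \frac{\partial A}{\partial \theta_2} = \frac{\Gamma'(\theta_2 + 1)}{\Gamma(\theta_2 + 1)} - \log(-\theta_1) = \frac{\Gamma'(a)}{\Gamma(a)} - \log b.$$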

Finally, by the definition of the digamma function, $\psi(a) = \frac{\Gamma'(a)}{\Gamma(a)}$, we arrive at:

$$\mathbb{E}[\ln x] = \psi(a) - \log b.$$

By using this property of the exponential family, we avoid tedious calculus. The remaining steps have already been detailed in Section 21.5.1.6.
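As a quick sanity check, here is a minimal Monte Carlo sketch (using NumPy/SciPy; the shape and rate values a = 3.0, b = 2.0 are arbitrary example choices, not from the book) comparing the empirical mean of ln x under Ga(a, b) with psi(a) - log b:

import numpy as np
from scipy.special import digamma

# Example shape (a) and rate (b) parameters; any positive values work.
a, b = 3.0, 2.0

rng = np.random.default_rng(0)
# NumPy parameterizes the Gamma by shape and scale = 1/rate.
samples = rng.gamma(shape=a, scale=1.0 / b, size=1_000_000)

mc_estimate = np.mean(np.log(samples))   # Monte Carlo estimate of E[ln x]
closed_form = digamma(a) - np.log(b)     # psi(a) - log b

print(f"Monte Carlo : {mc_estimate:.5f}")
print(f"Closed form : {closed_form:.5f}")

The two printed values should agree to within Monte Carlo error, which confirms (21.88).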
