Solution manual: Kevin P. Murphy, Machine Learning: a Probabilistic Perspective

Exercise 24.7 - Gibbs sampling for logistic regression with the Student approximation

Answers

The complete-data likelihood implied by (24.88)-(24.91) is:

$$
p(\boldsymbol{\lambda}, \mathbf{Z}, \mathbf{w}, \mathbf{Y} \mid \mathbf{X}, v)
= p(\boldsymbol{\lambda} \mid v)\, p(\mathbf{w})\, p(\mathbf{Z} \mid \mathbf{X}, \mathbf{w}, \boldsymbol{\lambda})\, p(\mathbf{Y} \mid \mathbf{Z})
= \left[\prod_{n=1}^{N} \mathrm{Ga}\!\left(\lambda_n \,\middle|\, \tfrac{v}{2}, \tfrac{v}{2}\right)\right] p(\mathbf{w}) \left[\prod_{n=1}^{N} \mathcal{N}\!\left(z_n \mid \mathbf{w}^T \mathbf{x}_n, \lambda_n^{-1}\right)\right] \left[\prod_{n=1}^{N} \mathbb{I}\!\left(y_n z_n > 0\right)\right].
$$

For $\boldsymbol{\lambda}$, we have:

$$
p(\lambda_n \mid \cdot) \propto \lambda_n^{v/2 - 1} \exp\!\left\{-\frac{v \lambda_n}{2}\right\} \cdot \lambda_n^{1/2} \exp\!\left\{-\frac{\lambda_n \left(z_n - \mathbf{w}^T \mathbf{x}_n\right)^2}{2}\right\}.
$$

So the conditional posterior is a Gamma distribution with shape and rate parameters

$$
\frac{v + 1}{2}, \qquad \frac{v}{2} + \frac{\left(z_n - \mathbf{w}^T \mathbf{x}_n\right)^2}{2}.
$$
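This Gamma update is easy to vectorize over the $N$ observations. A minimal sketch (the function name `sample_lambda` is mine, not from the book; note that NumPy's Gamma sampler takes a shape and a *scale*, i.e. the reciprocal of the rate derived above):

```python
import numpy as np

def sample_lambda(z, w, X, v, rng):
    """Sample each lambda_n from its Gamma full conditional.

    Shape is (v + 1) / 2 and rate is v / 2 + (z_n - w^T x_n)^2 / 2.
    numpy's gamma() is parameterized by shape and scale = 1 / rate.
    """
    resid2 = (z - X @ w) ** 2          # (z_n - w^T x_n)^2 for all n
    shape = (v + 1.0) / 2.0
    rate = v / 2.0 + resid2 / 2.0
    return rng.gamma(shape, 1.0 / rate)
```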

For $\mathbf{Z}$, we have:

$$
p(z_n \mid \cdot) \propto \exp\!\left\{-\frac{\lambda_n \left(z_n - \mathbf{w}^T \mathbf{x}_n\right)^2}{2}\right\},
$$

subject to the constraint $y_n z_n > 0$; therefore $z_n$ follows a Gaussian $\mathcal{N}\!\left(\mathbf{w}^T \mathbf{x}_n, \lambda_n^{-1}\right)$ truncated to the side of zero determined by $y_n$.
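The truncated-Gaussian draw can be sketched with `scipy.stats.truncnorm` (the function name `sample_z` and the $\pm 1$ coding of $y$ are my assumptions; `truncnorm` expects its bounds in standardized units):

```python
import numpy as np
from scipy.stats import truncnorm

def sample_z(y, w, X, lam, rng):
    """Sample z_n from N(w^T x_n, 1/lambda_n) truncated to y_n z_n > 0.

    y is assumed coded as +/-1. truncnorm takes standardized bounds
    a = (lo - mu) / sigma, b = (hi - mu) / sigma.
    """
    mu = X @ w
    sigma = 1.0 / np.sqrt(lam)
    lo = np.where(y > 0, 0.0, -np.inf)   # y_n = +1  ->  z_n in (0, inf)
    hi = np.where(y > 0, np.inf, 0.0)    # y_n = -1  ->  z_n in (-inf, 0)
    a, b = (lo - mu) / sigma, (hi - mu) / sigma
    return truncnorm.rvs(a, b, loc=mu, scale=sigma, random_state=rng)
```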

Finally, the conditional for $\mathbf{w}$ is Gaussian, as in Exercises 24.5 and 24.6: a weighted Bayesian linear regression update on the latent $z_n$, with per-observation precisions $\lambda_n$.
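Putting the three conditionals together gives one Gibbs sweep. A sketch under stated assumptions: the prior $\mathbf{w} \sim \mathcal{N}(0, \tau^2 I)$ and the resulting $\mathbf{w}$ update are taken from the standard conjugate linear-regression form (the exercise defers the exact derivation to 24.5/24.6), the function name `gibbs_probit_student` is mine, and $y$ is coded as $\pm 1$:

```python
import numpy as np
from scipy.stats import truncnorm

def gibbs_probit_student(X, y, v=4.0, tau2=10.0, iters=500, rng=None):
    """One possible Gibbs sampler for the Student-approximation model.

    Assumes y in {-1, +1} and a N(0, tau2 I) prior on w.
    Returns the chain of w samples, shape (iters, D).
    """
    if rng is None:
        rng = np.random.default_rng(0)
    N, D = X.shape
    w = np.zeros(D)
    lam = np.ones(N)
    samples = []
    for _ in range(iters):
        # 1. z_n | rest ~ N(w^T x_n, 1/lambda_n) truncated to y_n z_n > 0
        mu = X @ w
        sigma = 1.0 / np.sqrt(lam)
        lo = np.where(y > 0, 0.0, -np.inf)
        hi = np.where(y > 0, np.inf, 0.0)
        z = truncnorm.rvs((lo - mu) / sigma, (hi - mu) / sigma,
                          loc=mu, scale=sigma, random_state=rng)
        # 2. lambda_n | rest ~ Ga((v+1)/2, v/2 + (z_n - w^T x_n)^2 / 2)
        lam = rng.gamma((v + 1.0) / 2.0,
                        1.0 / (v / 2.0 + (z - mu) ** 2 / 2.0))
        # 3. w | rest ~ N(m, S), S^{-1} = I/tau2 + X^T Lambda X,
        #    m = S X^T Lambda z  (weighted Bayesian linear regression)
        prec = np.eye(D) / tau2 + X.T @ (lam[:, None] * X)
        S = np.linalg.inv(prec)
        m = S @ (X.T @ (lam * z))
        w = rng.multivariate_normal(m, S)
        samples.append(w.copy())
    return np.array(samples)
```

Each sweep is $O(ND^2 + D^3)$, dominated by forming and inverting the $D \times D$ posterior precision of $\mathbf{w}$.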

2021-03-24 13:42