
Exercise 24.5 - Gibbs sampling for robust linear regression with a Student t likelihood

Answers

Recall that in robust linear regression with a Student $t$ likelihood, each observation $y_n$ is augmented with a latent variable $z_n$ that scales the noise precision. The complete likelihood is:

$$p(\mathbf{Y}, \mathbf{X}, \mathbf{Z}, \mathbf{w}, \sigma^2, \nu) = p(\mathbf{X}, \mathbf{w}, \sigma^2, \nu)\, p(\mathbf{Z} \mid \nu)\, p(\mathbf{Y} \mid \mathbf{Z}, \mathbf{X}, \mathbf{w}, \sigma^2) \propto p(\mathbf{X}, \mathbf{w}, \sigma^2, \nu) \left( \prod_{n=1}^{N} z_n^{\nu/2 - 1} e^{-z_n \nu/2} \right) \left( \prod_{n=1}^{N} (\sigma^2)^{-1/2} z_n^{1/2} e^{-z_n (y_n - \mathbf{w}^T \mathbf{x}_n)^2 / (2\sigma^2)} \right).$$

From this we can read off the conditional distribution for $z_n$:

$$p(z_n \mid -) \propto z_n^{\frac{\nu+1}{2} - 1} \exp\left[ -z_n \left( \frac{\nu}{2} + \frac{(y_n - \mathbf{w}^T \mathbf{x}_n)^2}{2\sigma^2} \right) \right],$$

hence it is again a Gamma distribution, $z_n \mid - \sim \mathrm{Ga}\!\left( \frac{\nu+1}{2},\; \frac{\nu}{2} + \frac{(y_n - \mathbf{w}^T \mathbf{x}_n)^2}{2\sigma^2} \right)$.
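This Gamma conditional can be sampled directly. A minimal NumPy sketch, assuming $\nu$ is held fixed (the helper name `sample_z` and its argument layout are illustrative):

```python
import numpy as np

def sample_z(y, X, w, sigma2, nu, rng):
    """Sample each latent z_n from its Gamma full conditional:
    z_n | rest ~ Ga(shape=(nu+1)/2, rate=nu/2 + r_n^2/(2*sigma2)),
    where r_n = y_n - w^T x_n is the n-th residual."""
    r = y - X @ w                          # residuals, shape (N,)
    shape = (nu + 1.0) / 2.0
    rate = nu / 2.0 + r**2 / (2.0 * sigma2)
    # NumPy parameterises the Gamma by shape and *scale* = 1/rate
    return rng.gamma(shape, 1.0 / rate)
```

Note the shape/scale convention: NumPy's `gamma` takes a scale parameter, so the rate from the conditional must be inverted.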

For the variance σ 2 , its conditional posterior is:

$$p(\sigma^2 \mid -) \propto p(\sigma^2) \prod_{n=1}^{N} (\sigma^2)^{-1/2} e^{-z_n (y_n - \mathbf{w}^T \mathbf{x}_n)^2 / (2\sigma^2)},$$

which is conjugate to an Inverse-Gamma prior: with $\sigma^2 \sim \mathrm{IG}(a_0, b_0)$, the conditional posterior is $\mathrm{IG}\!\left( a_0 + \frac{N}{2},\; b_0 + \frac{1}{2} \sum_{n=1}^{N} z_n (y_n - \mathbf{w}^T \mathbf{x}_n)^2 \right)$.
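A sketch of the corresponding update, assuming an $\mathrm{IG}(a_0, b_0)$ prior on $\sigma^2$ (the hyperparameter names and the helper `sample_sigma2` are illustrative):

```python
import numpy as np

def sample_sigma2(y, X, w, z, a0, b0, rng):
    """Sample sigma^2 from its Inverse-Gamma full conditional:
    sigma^2 | rest ~ IG(a0 + N/2, b0 + sum_n z_n r_n^2 / 2)."""
    r = y - X @ w
    a = a0 + len(y) / 2.0
    b = b0 + 0.5 * np.sum(z * r**2)
    # an IG(a, b) draw is the reciprocal of a Gamma(a, scale=1/b) draw
    return 1.0 / rng.gamma(a, 1.0 / b)
```

Drawing the reciprocal of a Gamma variate avoids needing a dedicated Inverse-Gamma sampler.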

Finally, the conditional posterior of the weights $\mathbf{w}$ is Gaussian, so a conjugate Gaussian prior suffices for the Gibbs update: with $\mathbf{w} \sim \mathcal{N}(\mathbf{0}, \mathbf{V}_0)$ and $\boldsymbol{\Lambda} = \mathrm{diag}(z_1, \dots, z_N)$, we get $\mathbf{w} \mid - \sim \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ with $\boldsymbol{\Sigma} = \left( \mathbf{V}_0^{-1} + \mathbf{X}^T \boldsymbol{\Lambda} \mathbf{X} / \sigma^2 \right)^{-1}$ and $\boldsymbol{\mu} = \boldsymbol{\Sigma} \, \mathbf{X}^T \boldsymbol{\Lambda} \mathbf{y} / \sigma^2$.
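Putting the three conditionals together, here is a minimal Gibbs sampler sketch. It assumes a fixed $\nu$, an $\mathrm{IG}(a_0, b_0)$ prior on $\sigma^2$, and a $\mathcal{N}(\mathbf{0}, \tau^2 \mathbf{I})$ prior on $\mathbf{w}$; all of these prior choices, and the function name itself, are hypothetical defaults rather than the book's prescription:

```python
import numpy as np

def gibbs_robust_regression(y, X, nu=4.0, a0=1.0, b0=1.0, tau2=100.0,
                            n_iter=2000, seed=0):
    """Gibbs sampler for robust linear regression with a Student t
    likelihood, cycling through the three full conditionals derived
    above. Returns the sampled weights and noise variances."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    w = np.zeros(D)
    sigma2 = 1.0
    ws, s2s = [], []
    for _ in range(n_iter):
        r = y - X @ w
        # 1) z_n | rest ~ Ga((nu+1)/2, nu/2 + r_n^2 / (2 sigma2))
        z = rng.gamma((nu + 1.0) / 2.0,
                      1.0 / (nu / 2.0 + r**2 / (2.0 * sigma2)))
        # 2) w | rest ~ N(mu, Sigma) with precision V0^{-1} + X^T Z X / sigma2
        prec = X.T @ (z[:, None] * X) / sigma2 + np.eye(D) / tau2
        Sigma = np.linalg.inv(prec)
        mu = Sigma @ (X.T @ (z * y)) / sigma2
        w = rng.multivariate_normal(mu, Sigma)
        # 3) sigma2 | rest ~ IG(a0 + N/2, b0 + sum_n z_n r_n^2 / 2)
        r = y - X @ w
        sigma2 = 1.0 / rng.gamma(a0 + N / 2.0,
                                 1.0 / (b0 + 0.5 * np.sum(z * r**2)))
        ws.append(w)
        s2s.append(sigma2)
    return np.array(ws), np.array(s2s)
```

On data with heavy-tailed noise, averaging the post-burn-in draws of `w` should recover the true weights even in the presence of outliers, which is the point of the Student $t$ likelihood.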

2021-03-24 13:42