Exercise 20.4 - Inference in 2D lattice MRFs
Answers
For question (a), suppose the lattice has $m$ rows and $n$ columns. We consider only two options:
- Eliminate by column, i.e., eliminate $m$ variables each round.
- Eliminate by row, i.e., eliminate $n$ variables each round.
Let $\mathbf{x}$ denote the collection of all hidden variables and $\mathbf{v}$ the corresponding local evidence. Note that the local evidence can be absorbed into the local energy; the only thing broken is the normalization condition. It is easy to see that the bottleneck in computing $p(\mathbf{x} \mid \mathbf{v})$ is the evidence:
$$p(\mathbf{v}) = \int p(\mathbf{x}, \mathbf{v}) \, \mathrm{d}\mathbf{x}.$$
The order in which we eliminate all hidden variables determines the efficiency of the overall algorithm.
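To make the target quantity concrete, here is a minimal sketch of a tiny discrete $m \times n$ lattice MRF and a brute-force evaluation of the evidence. The potentials and names (`edge_pot`, `local_pot`) are made up for illustration and are not the book's; the brute-force sum is exponential in $mn$ and only serves as a reference for the elimination orderings analysed next.

```python
# A toy m x n lattice MRF with K discrete states per hidden node, and a
# brute-force computation of the evidence p(v): sum the unnormalized joint
# over all K^(m*n) hidden configurations.  Only feasible for tiny lattices.
import itertools
import numpy as np

rng = np.random.default_rng(0)
m, n, K = 2, 3, 2                        # lattice height, width, states per node

edge_pot = rng.random((K, K)) + 0.1      # shared psi(x_i, x_j) for every edge
local_pot = rng.random((m, n, K)) + 0.1  # psi(x_ij, v_ij): evidence already absorbed

def unnormalized_joint(x):
    """x is an (m, n) array of states; returns the product of all factors."""
    p = 1.0
    for i in range(m):
        for j in range(n):
            p *= local_pot[i, j, x[i, j]]
            if i + 1 < m:                          # vertical edge
                p *= edge_pot[x[i, j], x[i + 1, j]]
            if j + 1 < n:                          # horizontal edge
                p *= edge_pot[x[i, j], x[i, j + 1]]
    return p

# Brute-force evidence: sum over every joint configuration of the hidden grid.
evidence = sum(unnormalized_joint(np.array(cfg).reshape(m, n))
               for cfg in itertools.product(range(K), repeat=m * n))
print(evidence)
```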
For the first option, we begin by cancelling the last column, $\mathbf{x}_{:,n}$; this step involves altogether
$$(m - 1) + m + m = 3m - 1$$
terms, in which $m - 1$ are edges within this column, $m$ are edges with the $(n-1)$-th column, and $m$ are local evidence terms. To perform the integral, let us assume that a sampling-based paradigm is adopted, and each variable in $\mathbf{x}_{:,n}$ and $\mathbf{x}_{:,n-1}$ is sampled for $K$ times. This elimination ends up with a table over $\mathbf{x}_{:,n-1}$ with $K^m$ entries, each of which requires summing over the $K^m$ sampled configurations of $\mathbf{x}_{:,n}$. Hence the integral is performed under complexity:
$$O\left(K^m \cdot K^m\right) = O\left(K^{2m}\right).$$
This procedure shall be iterated along the MRF for $n$ times, resulting in an overall complexity of:
$$O\left(n K^{2m}\right).$$
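Below is a sketch of this column-by-column elimination on the same toy factors as in the previous snippet (again, `edge_pot`, `local_pot`, `column_factor`, `cross_factor` are illustrative names, not the book's notation). Each step sums out one column of $K^m$ joint states to produce a $K^m$-entry table over its left neighbour, which is exactly the $O(nK^{2m})$ pattern derived above.

```python
# Column-by-column variable elimination on a toy lattice MRF.
import itertools
import numpy as np

rng = np.random.default_rng(0)
m, n, K = 2, 3, 2
edge_pot = rng.random((K, K)) + 0.1       # shared psi(x_i, x_j) for every edge
local_pot = rng.random((m, n, K)) + 0.1   # psi(x_ij, v_ij), evidence absorbed

cols = list(itertools.product(range(K), repeat=m))   # the K^m joint states of a column

def column_factor(j, s):
    """Factors living entirely inside column j: local evidence + vertical edges."""
    p = 1.0
    for i in range(m):
        p *= local_pot[i, j, s[i]]
        if i + 1 < m:
            p *= edge_pot[s[i], s[i + 1]]
    return p

def cross_factor(left, right):
    """The m horizontal edges joining two neighbouring columns."""
    p = 1.0
    for i in range(m):
        p *= edge_pot[left[i], right[i]]
    return p

msg = {c: 1.0 for c in cols}              # nothing eliminated yet
for j in reversed(range(1, n)):           # sum out columns n-1, ..., 1 (0-based)
    msg = {left: sum(cross_factor(left, right) * column_factor(j, right) * msg[right]
                     for right in cols)   # K^m terms per entry ...
           for left in cols}              # ... and K^m entries: K^{2m} work per column

# Absorb the first column's own factors to finish the computation of p(v).
evidence_ve = sum(column_factor(0, c) * msg[c] for c in cols)
print(evidence_ve)                        # matches the brute-force sum above
```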
Symmetrically, that for the second option is:
$$O\left(m K^{2n}\right).$$
For question (b), with $m < n$, we should adopt the first method. For example, with $K = 2$, $m = 5$ and $n = 100$, the first ordering costs on the order of $100 \times 2^{10} \approx 10^5$ operations, while the second costs $5 \times 2^{200} \approx 10^{61}$.
For question (c), since $m = n$, either method will do. Viewing each column as a single mega-node with $K^n$ joint states turns the lattice into a chain of $n$ such nodes, so we can adopt the Viterbi algorithm for HMMs in this case, using a table of size $K^n$ per column to save the propagated information (as dynamic programming).
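A minimal sketch of this mega-node dynamic programme is given below, reusing the toy factors from the earlier snippets (all names are illustrative assumptions, not the book's). It runs a max-product forward pass with back-pointers over the chain of columns and backtracks to recover the most probable hidden grid; the same view applies to any $m \times n$ lattice, with $K^m$ states per mega-node.

```python
# Viterbi (max-product) over the chain of mega-columns of a toy lattice MRF.
import itertools
import numpy as np

rng = np.random.default_rng(0)
m, n, K = 2, 3, 2
edge_pot = rng.random((K, K)) + 0.1
local_pot = rng.random((m, n, K)) + 0.1
cols = list(itertools.product(range(K), repeat=m))   # K^m joint states of a column

def column_factor(j, s):
    """Factors inside column j: local evidence + vertical edges."""
    p = 1.0
    for i in range(m):
        p *= local_pot[i, j, s[i]]
        if i + 1 < m:
            p *= edge_pot[s[i], s[i + 1]]
    return p

def cross_factor(left, right):
    """The m horizontal edges joining two neighbouring columns."""
    p = 1.0
    for i in range(m):
        p *= edge_pot[left[i], right[i]]
    return p

# Forward pass: delta[j][s] is the best score of columns 0..j with column j in
# joint state s; psi stores the arg-max back-pointers for backtracking.
delta = [{s: column_factor(0, s) for s in cols}]
psi = []
for j in range(1, n):
    d, bp = {}, {}
    for s in cols:
        best_prev = max(cols, key=lambda p_: delta[-1][p_] * cross_factor(p_, s))
        d[s] = delta[-1][best_prev] * cross_factor(best_prev, s) * column_factor(j, s)
        bp[s] = best_prev
    delta.append(d)
    psi.append(bp)

# Backtrack to read off the MAP assignment, one column at a time.
best = max(cols, key=lambda s: delta[-1][s])
path = [best]
for bp in reversed(psi):
    path.append(bp[path[-1]])
path.reverse()
print(np.array(path).T)   # (m, n) array: the most probable hidden grid
```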