Summary

In this chapter, we discussed several methods for performing approximate inference in graphical models: cluster graph belief propagation, propagation with approximate messages, and sampling-based inference. In cluster graph belief propagation, we relaxed the requirement of having a clique tree and instead performed belief propagation on a cluster graph, accepting approximate beliefs in exchange for tractability. In propagation with approximate messages, rather than relaxing the constraints on the graph structure, we approximated the messages passed between clusters. We then discussed sampling methods in detail. Samples can take two forms: full particles, where each sample is an instantiation of all the variables of the network, and collapsed particles, where each sample is an instantiation of only a subset of the network's variables. We also examined the main drawback of full particles: each sample covers only a very small part of the joint assignment space, so many more samples are required than in the case of collapsed particles. Finally, we discussed Markov chain Monte Carlo methods, which are extensively used in practical problems.
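As a minimal sketch of the MCMC idea with full particles, consider Gibbs sampling on a hypothetical three-node chain A -> B -> C of binary variables. The network and its CPD values below are illustrative assumptions, not taken from the chapter; the sampler estimates P(A=1 | C=1) by repeatedly resampling each hidden variable given the current values of the others:

```python
import random

# Hypothetical Bayesian network A -> B -> C with binary variables.
# All CPD values below are illustrative assumptions.
P_A1 = 0.3                   # P(A=1)
P_B1_A = {0: 0.2, 1: 0.8}    # P(B=1 | A)
P_C1_B = {0: 0.1, 1: 0.9}    # P(C=1 | B)

def gibbs_p_a1_given_c1(n_samples=50000, burn_in=2000, seed=0):
    """Estimate P(A=1 | C=1) by Gibbs sampling over the hidden variables A, B.

    Each state (a, b) is a full particle: an assignment to every
    unobserved variable in the network (C is clamped to 1).
    """
    rng = random.Random(seed)
    a, b = 0, 0  # arbitrary initial assignment
    hits = 0
    for t in range(burn_in + n_samples):
        # Resample A from P(A | B=b); C is independent of A given B.
        w1 = P_A1 * (P_B1_A[1] if b else 1 - P_B1_A[1])
        w0 = (1 - P_A1) * (P_B1_A[0] if b else 1 - P_B1_A[0])
        a = 1 if rng.random() < w1 / (w0 + w1) else 0
        # Resample B from P(B | A=a, C=1).
        w1 = P_B1_A[a] * P_C1_B[1]
        w0 = (1 - P_B1_A[a]) * P_C1_B[0]
        b = 1 if rng.random() < w1 / (w0 + w1) else 0
        if t >= burn_in:  # discard burn-in samples before counting
            hits += a
    return hits / n_samples
```

For this small model the exact posterior can be computed by enumeration (P(A=1 | C=1) = 0.222 / 0.404 ≈ 0.55), which makes it easy to check that the chain converges to the right answer.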

In the next chapter, we will discuss parameter estimation for Bayesian networks, which will allow us to construct graphical models from the data we have.
