Summary

Probabilistic graphical models (PGMs) capture domain knowledge as relationships between random variables and compactly represent joint probability distributions. They are used in a wide range of applications.

Probability maps an event to a real value between 0 and 1 and can be interpreted as a measure of the frequency of occurrence (the frequentist view) or as a degree of belief in that occurrence (the Bayesian view). The concepts of random variables, conditional probabilities, Bayes' theorem, the chain rule, marginal and conditional independence, and factors form the foundation for understanding PGMs. MAP and marginal MAP queries are ways to ask questions about the variables and relationships in the graph.
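As a quick refresher on Bayes' theorem, the following minimal sketch (with made-up numbers for a disease D and a test result T) computes a posterior from a prior and two likelihoods, using the law of total probability for the denominator:

```java
public class BayesDemo {
    public static void main(String[] args) {
        // Hypothetical values: prior P(D), likelihoods P(T|D) and P(T|~D)
        double pD = 0.01;
        double pTgivenD = 0.95;
        double pTgivenNotD = 0.05;

        // Marginal P(T) by the law of total probability
        double pT = pTgivenD * pD + pTgivenNotD * (1 - pD);

        // Bayes' theorem: P(D|T) = P(T|D) * P(D) / P(T)
        double pDgivenT = pTgivenD * pD / pT;
        System.out.printf("P(D|T) = %.4f%n", pDgivenT);
    }
}
```

Even with a highly accurate test, the small prior keeps the posterior modest, which is the kind of reasoning PGM queries automate at scale.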

The structure of graphs and their properties, such as paths, trails, cycles, subgraphs, and cliques, are vital to understanding Bayesian networks. Representation, inference, and learning form the core elements of these networks, helping us capture knowledge, extract it, and make predictions. From the representation of the graph, we can reason about the flow of influence and detect independencies that reduce the computational load when querying the model. Junction trees, variable elimination, and belief propagation likewise make inference more tractable through reductive steps. Learning Bayesian networks involves estimating the structure and model parameters from data, and we discussed several methods for each.
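To make the reductive idea behind variable elimination concrete, here is a minimal sketch on a hypothetical two-node chain A -> B with binary variables and made-up CPTs: eliminating A by a sum-product step yields the marginal P(B) without ever materializing the full joint:

```java
public class VariableEliminationDemo {
    public static void main(String[] args) {
        // Hypothetical CPTs for the chain A -> B (both variables binary)
        double[] pA = {0.6, 0.4};               // P(A)
        double[][] pBgivenA = {{0.7, 0.3},      // P(B | A=0)
                               {0.2, 0.8}};     // P(B | A=1)

        // Eliminate A: P(B=b) = sum over a of P(A=a) * P(B=b | A=a)
        double[] pB = new double[2];
        for (int b = 0; b < 2; b++) {
            for (int a = 0; a < 2; a++) {
                pB[b] += pA[a] * pBgivenA[a][b];
            }
        }
        System.out.printf("P(B=1) = %.2f%n", pB[1]);
    }
}
```

On larger networks the same sum-product step is applied variable by variable, and a good elimination order keeps the intermediate factors small.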

Markov networks (MNs), which have undirected edges, capture interactions using parameterization techniques such as Gibbs parameterization, factor graphs, and log-linear models. Independencies in MNs govern the flow of influence, as in Bayesian networks, and the inference techniques are similar. Learning parameters and structure in MNs is hard, so approximate methods are used. Specialized networks such as tree-augmented networks (TAN) make independence assumptions among nodes and are very useful in some applications. Markov chains and hidden Markov models are other specialty networks that find application in a range of fields.
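As a minimal illustration of a Markov chain, this sketch uses a hypothetical two-state weather model (sunny/rainy) and propagates the state distribution one step by multiplying it with the transition matrix:

```java
public class MarkovChainDemo {
    public static void main(String[] args) {
        // Hypothetical transition matrix: state 0 = sunny, state 1 = rainy
        double[][] T = {{0.9, 0.1},   // transitions out of sunny
                        {0.5, 0.5}};  // transitions out of rainy
        double[] p = {1.0, 0.0};      // start in sunny with certainty

        // One step of the chain: p'(j) = sum over i of p(i) * T[i][j]
        double[] next = new double[2];
        for (int j = 0; j < 2; j++) {
            for (int i = 0; i < 2; i++) {
                next[j] += p[i] * T[i][j];
            }
        }
        System.out.printf("P(rainy after 1 step) = %.2f%n", next[1]);
    }
}
```

A hidden Markov model adds an emission distribution on top of exactly this transition structure, with the state itself unobserved.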

OpenMarkov and the Weka Bayesian Network GUI were introduced as Java-based tools for PGMs. The case study in this chapter used Bayesian networks to learn from the UCI Adult census dataset, and their performance was compared to other (non-PGM) classifiers.
