Summary

In this chapter, we covered the basic principles of Bayesian inference. Starting with how uncertainty is treated differently in Bayesian statistics compared to classical statistics, we discussed the various components of Bayes' rule in detail. First, we learned about the different types of prior distributions and how to choose the right one for a given problem. We then learned how to estimate the posterior distribution using techniques such as MAP estimation, the Laplace approximation, and MCMC simulations. Having understood this chapter, readers will be in a position to apply Bayesian principles to their data analytics problems. Before we start discussing specific Bayesian machine learning problems, in the next chapter we will review machine learning in general.
