Summary

In this chapter, we took our first look at probability theory, learning about random variables and conditional probabilities, which allowed us to get a glimpse of Bayes' theorem, the underpinning of the Naive Bayes classifier. We talked about the differences between discrete and continuous random variables, likelihoods and probabilities, priors and evidence, and normal and Naive Bayes classifiers.

Finally, our theoretical knowledge would be of no use if we didn't apply it to practical examples. We obtained a dataset of raw email messages, parsed it, and trained Bayesian classifiers on it to classify emails as either spam or ham (not spam) using a variety of feature extraction approaches.
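The spam-filtering pipeline described above can be sketched in miniature. The following is a minimal, self-contained illustration (not the book's actual code) of a multinomial Naive Bayes classifier using only the standard library; the toy messages and word-count features are hypothetical stand-ins for the real email dataset and feature extraction steps:

```python
import math
from collections import Counter

# Toy training corpus: (message, label) pairs.
# Hypothetical data standing in for the parsed email dataset.
train = [
    ("win money now", "spam"),
    ("free money offer", "spam"),
    ("meeting at noon", "ham"),
    ("lunch at noon tomorrow", "ham"),
]

# Count words per class (the likelihoods) and class frequencies (the priors).
word_counts = {"spam": Counter(), "ham": Counter()}
class_counts = Counter()
for text, label in train:
    class_counts[label] += 1
    word_counts[label].update(text.split())

vocab = {w for counts in word_counts.values() for w in counts}

def predict(text):
    """Return the class maximizing log P(class) + sum of log P(word | class)."""
    best_label, best_score = None, float("-inf")
    for label in class_counts:
        # Log prior: fraction of training messages with this label
        score = math.log(class_counts[label] / sum(class_counts.values()))
        total = sum(word_counts[label].values())
        for w in text.split():
            # Laplace (add-one) smoothing avoids zero probability
            # for words never seen with this label
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(predict("free money"))    # -> spam
print(predict("noon meeting"))  # -> ham
```

In practice, the same structure scales up: the feature extraction step (here, naive whitespace tokenization and raw word counts) is what varied across the approaches in the chapter, while the Bayes' theorem machinery stays the same.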

In the next chapter, we will switch gears and discuss what to do when we have to deal with unlabeled data.
