Pros and cons

There is only so much information that can be crammed into one chapter. The examples selected here do not do justice to the versatility and accuracy of the Naïve Bayes family of classifiers.

The Naïve Bayes algorithm is a simple and robust generative classifier that relies on class priors and conditional probabilities estimated from a training dataset. The Naïve Bayes model has the following benefits:

  • It is easy to implement and parallelize
  • It has a very low computational complexity: O((n+c)*m), where m is the number of features, c is the number of classes, and n is the number of observations
  • It handles missing data
  • It supports incremental updates, insertions, and deletions (see the sketch after this list)
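
The incremental-update property follows from the fact that the trained model is essentially a set of per-class counters. The following sketch, in plain Python with hypothetical class and method names, illustrates how a single observation with categorical features can be inserted into or removed from the model in O(m) time, without retraining:

    from collections import defaultdict

    class IncrementalNaiveBayes:
        """Per-class counters for categorical features (illustrative only)."""

        def __init__(self):
            self.class_counts = defaultdict(int)     # N(class = c)
            self.feature_counts = defaultdict(int)   # N(feature j = v, class = c)

        def insert(self, x, y):
            """Add one observation x (list of feature values) with label y."""
            self.class_counts[y] += 1
            for j, v in enumerate(x):
                self.feature_counts[(j, v, y)] += 1

        def delete(self, x, y):
            """Remove a previously inserted observation by decrementing the same counters."""
            self.class_counts[y] -= 1
            for j, v in enumerate(x):
                self.feature_counts[(j, v, y)] -= 1

Because only the counters touched by the observation change, updates are cheap and naturally parallelizable across features or partitions of the data.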

However, Naïve Bayes is not a silver bullet. It has the following disadvantages:

  • It requires a large training set to achieve reasonable accuracy
  • The assumption that features are independent rarely holds in real-world data
  • It requires dealing with the zero-frequency problem for counters (a common remedy is sketched after this list)
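
The zero-frequency problem arises when a feature value never co-occurs with a class in the training set: the raw conditional probability is zero and annihilates the entire product of likelihoods. A common remedy, sketched below in Python under the assumption of categorical features and the counter layout used earlier (function and parameter names are illustrative), is Laplace (add-one) smoothing:

    def smoothed_conditional(count_v_c, count_c, num_values, alpha=1.0):
        """Laplace-smoothed estimate of P(feature = v | class = c).

        count_v_c   -- training observations of class c with feature value v
        count_c     -- total training observations of class c
        num_values  -- number of distinct values the feature can take
        alpha       -- smoothing constant (1.0 gives add-one smoothing)
        """
        return (count_v_c + alpha) / (count_c + alpha * num_values)

    # Without smoothing, a zero count forces the estimate to 0;
    # with alpha = 1 it stays strictly positive.
    print(smoothed_conditional(0, 20, 3))   # 1/23, roughly 0.043, instead of 0.0

The smoothing constant alpha trades a small bias in the estimates for robustness against unseen feature values.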