Learning to use Naïve Bayes classifiers

Learning from examples can be hard, even for humans. For instance, given two lists of paired values, it is not always easy to see the connection between them. One way of tackling this problem is to assign labels to one set of values and learn from the result, and that's where classification algorithms come in handy.

Naïve Bayes classifiers are prediction algorithms for assigning labels to problem instances. They apply Bayes' theorem with a strong (naïve) independence assumption between the input variables. One of the key advantages of these classifiers is their scalability.
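For reference, the rule behind the classifier can be written as follows, where C is the label and x_1, …, x_n are the attribute values; under the independence assumption, the joint likelihood factors into per-attribute terms:

```latex
P(C \mid x_1, \ldots, x_n) \propto P(C) \prod_{i=1}^{n} P(x_i \mid C)
```

The predicted label is the one with the highest resulting score, which is exactly the comparison our classifier will make between its two labels.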

Getting ready…

Since it is hard to build a fully general classifier, we will build ours assuming that the inputs are positive- and negative-labeled examples. So, the first thing we need to address is defining the labels that our classifier will handle, using an enum called NBCLabel:

public enum NBCLabel
{
    POSITIVE,
    NEGATIVE
}

How to do it…

The classifier we'll build takes just five steps:

  1. Create the class and its member variables. The counters are lists of integers, one per attribute, so that each entry stores how many examples of that label had the attribute set to true:
    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;
    
    public class NaiveBayesClassifier : MonoBehaviour
    {
        public int numAttributes;
        public int numExamplesPositive;
        public int numExamplesNegative;
    
        // Per-attribute counts of true values among the
        // positive and negative examples seen so far
        public List<int> attrCountPositive;
        public List<int> attrCountNegative;
    }
  2. Define the Awake method for initialization, creating one counter per attribute:
    void Awake()
    {
        attrCountPositive = new List<int>(new int[numAttributes]);
        attrCountNegative = new List<int>(new int[numAttributes]);
    }
  3. Implement the function for updating the classifier with one labeled example:
    public void UpdateClassifier(bool[] attributes, NBCLabel label)
    {
        List<int> counts;
        if (label == NBCLabel.POSITIVE)
        {
            numExamplesPositive++;
            counts = attrCountPositive;
        }
        else
        {
            numExamplesNegative++;
            counts = attrCountNegative;
        }
        // Increment the counter of every attribute set to true
        for (int i = 0; i < numAttributes; i++)
        {
            if (attributes[i])
                counts[i]++;
        }
    }
  4. Define the function for computing the naïve probability, where m is the number of examples carrying the label being scored and n the number carrying the opposite label:
    public float NaiveProbabilities(
            bool[] attributes,
            int[] counts,
            float m,
            float n)
    {
        float prior = m / (m + n);
        float p = 1f;
        for (int i = 0; i < numAttributes; i++)
        {
            // Multiply by counts[i] / m when the attribute is true,
            // or by (m - counts[i]) / m when it is false
            p /= m;
            if (attributes[i])
                p *= counts[i];
            else
                p *= m - counts[i];
        }
        return prior * p;
    }
  5. Finally, implement the function for prediction:
    public bool Predict(bool[] attributes)
    {
        float nep = numExamplesPositive;
        float nen = numExamplesNegative;
        float x = NaiveProbabilities(attributes, attrCountPositive.ToArray(), nep, nen);
        float y = NaiveProbabilities(attributes, attrCountNegative.ToArray(), nen, nep);
        return x >= y;
    }
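With the five steps in place, the classifier can be trained and queried as follows. This is a minimal sketch, assuming the classifier component is attached to the same game object and numAttributes has been set to 3 in the Inspector; the attribute names and training data are made up for illustration:

```csharp
using UnityEngine;

public class NBCExample : MonoBehaviour
{
    void Start()
    {
        // Assumes a NaiveBayesClassifier component on the same game object,
        // with numAttributes set to 3 in the Inspector
        NaiveBayesClassifier nbc = GetComponent<NaiveBayesClassifier>();

        // Hypothetical training data: each attribute encodes a
        // yes/no observation, e.g. { isClose, isArmed, isInjured }
        nbc.UpdateClassifier(new bool[] { true, true, false }, NBCLabel.POSITIVE);
        nbc.UpdateClassifier(new bool[] { true, false, false }, NBCLabel.POSITIVE);
        nbc.UpdateClassifier(new bool[] { false, true, true }, NBCLabel.NEGATIVE);

        // Query a new, unseen instance
        bool isPositive = nbc.Predict(new bool[] { true, true, true });
        Debug.Log("Prediction: " + (isPositive ? "POSITIVE" : "NEGATIVE"));
    }
}
```

Note that Predict should only be called after at least one example of each label has been stored; otherwise the prior works out to a division involving zero examples.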

How it works…

The UpdateClassifier function takes an example's input values and its label and updates the per-attribute counters; it is the first function to be called, once per training example. The NaiveProbabilities function is responsible for computing the score of each label for the prediction function to compare. Finally, the Predict function is the one we call with a new set of attributes in order to get the result of the classification: true for POSITIVE and false for NEGATIVE.
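To make the arithmetic concrete, here is a small made-up trace with two attributes. Suppose we have stored three POSITIVE examples, in which the first attribute was true twice and the second attribute was true once, and two NEGATIVE examples, in which the first attribute was never true and the second was true twice. Querying the attributes {true, false} gives:

```latex
x = \frac{3}{5} \cdot \frac{2}{3} \cdot \frac{3-1}{3} \approx 0.267,
\qquad
y = \frac{2}{5} \cdot \frac{0}{2} \cdot \frac{2-2}{2} = 0
```

Since x >= y, the prediction is POSITIVE: the first factor in each product is the prior, and each remaining factor is the fraction of that label's examples matching the queried attribute value.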
