The expectation-maximization algorithm

GMM uses the expectation-maximization (EM) algorithm to identify the components of the mixture of Gaussian distributions. The goal is to learn the parameters of these probability distributions from unlabeled data.

The algorithm proceeds iteratively as follows (a minimal code sketch follows the list):

  1. Initialization—Assume random centroids (for example, using k-Means)
  2. Repeat the following steps until convergence (that is, until changes in the assignments drop below a threshold):
    • Expectation step: Soft assignment; compute, for each point, the probability that it was generated by each component distribution
    • Maximization step: Adjust each Gaussian's parameters to maximize the likelihood of the data under these soft assignments
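
The following is a minimal NumPy sketch of these two steps, assuming a data matrix `X` of shape `(n_samples, n_features)`; the synthetic data, the component count, and the fixed iteration budget are illustrative stand-ins, not the definitive implementation:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(42)
X = rng.normal(size=(150, 2))      # illustrative stand-in for real data
n, d = X.shape
n_components = 3

# Initialization: random points as means, identity covariances,
# uniform mixture weights (k-means centroids would also work here)
means = X[rng.choice(n, n_components, replace=False)]
covs = np.array([np.eye(d)] * n_components)
weights = np.full(n_components, 1 / n_components)

for _ in range(50):  # a fixed budget stands in for a convergence check
    # Expectation step: responsibility of each component for each point,
    # proportional to mixture weight times Gaussian density
    dens = np.column_stack([
        w * multivariate_normal(m, c).pdf(X)
        for w, m, c in zip(weights, means, covs)
    ])
    resp = dens / (dens.sum(axis=1, keepdims=True) + 1e-300)

    # Maximization step: re-estimate weights, means, and covariances
    # from the responsibility-weighted points
    nk = resp.sum(axis=0)
    weights = nk / n
    means = (resp.T @ X) / nk[:, None]
    for k in range(n_components):
        diff = X - means[k]
        covs[k] = ((resp[:, k, None] * diff).T @ diff) / nk[k]
        covs[k] += 1e-6 * np.eye(d)  # regularize to keep covariances non-singular
```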

The following plot shows the GMM cluster membership probabilities for the Iris dataset as contour lines:

[Figure: GMM cluster membership probabilities for the Iris dataset, shown as contour lines]
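
A plot along these lines can be reproduced with scikit-learn's GaussianMixture; the sketch below uses the first two Iris features and evaluates membership probabilities on a grid. The feature choice, grid resolution, and contour levels are assumptions for illustration, not the figure's actual plotting code:

```python
import matplotlib.pyplot as plt
import numpy as np
from sklearn.datasets import load_iris
from sklearn.mixture import GaussianMixture

X = load_iris().data[:, :2]   # assumption: sepal length and sepal width only
gmm = GaussianMixture(n_components=3, random_state=0).fit(X)

# Evaluate membership probabilities on a grid and draw them as contour lines
xx, yy = np.meshgrid(np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 200),
                     np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 200))
grid = np.column_stack([xx.ravel(), yy.ravel()])
probs = gmm.predict_proba(grid)   # shape (n_grid_points, 3)

for k in range(3):
    plt.contour(xx, yy, probs[:, k].reshape(xx.shape), levels=[0.5, 0.9])
plt.scatter(X[:, 0], X[:, 1], s=10, c=gmm.predict(X))
plt.xlabel('sepal length (cm)')
plt.ylabel('sepal width (cm)')
plt.show()
```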
