Finding separating boundary with support vector machines

After introducing Naïve Bayes, a simple yet powerful classifier, we will continue with another great classifier that is popular for text classification: the support vector machine (SVM).

In machine learning classification, an SVM finds an optimal hyperplane that best segregates observations from different classes. A hyperplane is an (n-1)-dimensional subspace that separates an n-dimensional feature space into two half-spaces. For example, the hyperplane in a two-dimensional feature space is a line, and in a three-dimensional feature space it is a plane. The optimal hyperplane is chosen so that the distance from the nearest points in each half-space to the hyperplane is maximized; these nearest points are the so-called support vectors. The following toy example demonstrates what support vectors and a separating hyperplane (along with the distance margin, which we will explain later) look like in a binary classification case:
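As a minimal sketch of the idea (assuming scikit-learn is available; the toy data points are made up for illustration), we can fit a linear SVM on two small, linearly separable clusters and inspect the resulting hyperplane and support vectors:

```python
import numpy as np
from sklearn.svm import SVC

# Two linearly separable clusters in a two-dimensional feature space
X = np.array([[1, 1], [2, 1], [1, 2],    # class 0
              [4, 4], [5, 4], [4, 5]])   # class 1
y = np.array([0, 0, 0, 1, 1, 1])

# A large C approximates a hard margin on separable data
clf = SVC(kernel='linear', C=1e6)
clf.fit(X, y)

# In 2-D, the separating hyperplane is the line w1*x1 + w2*x2 + b = 0
w, b = clf.coef_[0], clf.intercept_[0]
print("hyperplane: %.2f*x1 + %.2f*x2 + %.2f = 0" % (w[0], w[1], b))

# The support vectors are the training points closest to the hyperplane
print("support vectors:\n", clf.support_vectors_)
```

Only the few points nearest to the boundary end up as support vectors; the remaining points could be moved or removed without changing the fitted hyperplane, which is what makes the model compact.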

[Figure: support vectors and a separating hyperplane, with the distance margin, in a binary classification example]