Weak learners

Weak learners are classifiers that are only slightly correlated with the true classification; they perform somewhat better than random guessing. In contrast, strong learners are arbitrarily well correlated with the correct classification.

The idea here is that you don't use just one weak learner but a broad set of them, each slightly better than random. Many instances of weak learners can be pooled together, using techniques such as boosting or bagging, to create a strong ensemble classifier. A benefit is that the final classifier is less prone to overfitting your training data.
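A small simulation can make the pooling idea concrete. In this hypothetical setup, each weak learner is right only 60% of the time, yet a simple majority vote over 25 of them is dramatically more accurate (the constants and function names below are illustrative, not from any particular library):

```python
import random

random.seed(0)

# Hypothetical setup: each weak learner predicts the true label
# correctly with probability 0.6 -- only slightly better than random.
P_CORRECT = 0.6
N_LEARNERS = 25     # size of the ensemble
N_SAMPLES = 10_000  # number of test cases to simulate

def weak_prediction(true_label):
    """One weak learner's vote: right 60% of the time."""
    return true_label if random.random() < P_CORRECT else 1 - true_label

single_hits = 0
ensemble_hits = 0
for _ in range(N_SAMPLES):
    true_label = random.randint(0, 1)
    votes = [weak_prediction(true_label) for _ in range(N_LEARNERS)]
    single_hits += votes[0] == true_label
    # Majority vote pools the weak learners into one strong prediction.
    majority = 1 if sum(votes) > N_LEARNERS / 2 else 0
    ensemble_hits += majority == true_label

print(f"single learner accuracy: {single_hits / N_SAMPLES:.3f}")
print(f"majority-vote accuracy:  {ensemble_hits / N_SAMPLES:.3f}")
```

The single learner lands near 0.6 accuracy while the majority vote climbs well above 0.8; this assumes the learners' errors are independent, which real ensemble methods only approximate.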

For example, AdaBoost fits a sequence of weak learners on repeatedly reweighted versions of the training data. It starts by giving equal weight to every observation/sample and training the first learner on the training dataset. Observations/samples that the learner mispredicts are then given higher weight, so the next learner concentrates on them. Since it is an iterative process, it continues to add learners until a limit is reached in the number of models or accuracy.
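The loop above can be sketched in a few lines. This is a minimal, illustrative AdaBoost on a toy 1-D dataset, using decision stumps (single-threshold rules) as the weak learners; the dataset, function names, and round count are assumptions made for the example:

```python
import math

# Toy 1-D training set (hypothetical): no single threshold separates
# the classes, so every individual stump is only a weak learner.
X = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
y = [1, 1, 1, -1, -1, -1, 1, 1, 1, 1]

def best_stump(weights):
    """Pick the threshold/polarity stump with the lowest weighted error."""
    best = None
    for thr in range(len(X) + 1):
        for polarity in (1, -1):
            preds = [polarity if x < thr else -polarity for x in X]
            err = sum(w for w, p, t in zip(weights, preds, y) if p != t)
            if best is None or err < best[0]:
                best = (err, thr, polarity)
    return best

def ada_boost(n_rounds):
    weights = [1 / len(X)] * len(X)   # start with equal sample weights
    ensemble = []                     # list of (alpha, threshold, polarity)
    for _ in range(n_rounds):
        err, thr, polarity = best_stump(weights)
        err = max(err, 1e-10)         # guard against division by zero
        alpha = 0.5 * math.log((1 - err) / err)  # learner's vote weight
        ensemble.append((alpha, thr, polarity))
        # Raise the weight of mispredicted samples, lower the rest,
        # then renormalize so the weights stay a distribution.
        for i, x in enumerate(X):
            pred = polarity if x < thr else -polarity
            weights[i] *= math.exp(-alpha * y[i] * pred)
        total = sum(weights)
        weights = [w / total for w in weights]
    return ensemble

def predict(ensemble, x):
    """Weighted vote of all stumps, thresholded at zero."""
    score = sum(a * (p if x < thr else -p) for a, thr, p in ensemble)
    return 1 if score >= 0 else -1

model = ada_boost(n_rounds=5)
train_acc = sum(predict(model, x) == t for x, t in zip(X, y)) / len(X)
print(f"training accuracy after boosting: {train_acc:.2f}")
```

Each round the mispredicted samples gain weight, so the next stump is forced to attend to them; on this toy data a handful of stumps that each err on several points combine into an ensemble that fits the training set exactly.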
