Creating Ensembles and Multiclass Methods

"This is how you win ML competitions: you take other people's work and ensemble them together."
- Vitaly Kuznetsov, NIPS 2014

You may have already realized that we've been discussing ensemble learning. Scholarpedia (www.scholarpedia.org) defines it as the process by which multiple models, such as classifiers or experts, are strategically generated and combined to solve a particular computational intelligence problem. In random forest and gradient boosting, we combined the votes of hundreds or thousands of trees to make a prediction, so by definition those models are ensembles. This methodology can be extended to any learner to create ensembles, which some refer to as meta-ensembles or meta-learners. We'll look at one of these methods, known as stacking: we produce a number of base classifiers and use their predicted class probabilities as input features to another classifier. This method can result in improved predictive accuracy.

In the previous chapters, we focused on classification problems with binary outcomes. We'll now look at methods for predicting situations where the data consists of more than two outcomes (multiclass), a very common situation in real-world datasets.
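To make the stacking idea concrete, here is a minimal sketch in Python using scikit-learn's `StackingClassifier` on synthetic data (the dataset, base learners, and final estimator are illustrative choices, not the specific models used later in the chapter). The base models' out-of-fold predicted class probabilities become the input features for the final logistic regression.

```python
# Illustrative stacking sketch; assumes scikit-learn is installed.
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# A synthetic binary classification problem for demonstration.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

# Two ensemble base learners, echoing the random forest and
# gradient boosting models from earlier chapters.
base_learners = [
    ("rf", RandomForestClassifier(n_estimators=100, random_state=42)),
    ("gb", GradientBoostingClassifier(random_state=42)),
]

# stack_method="predict_proba" feeds the base models' predicted class
# probabilities (computed out-of-fold via cv=5) to the final estimator.
stack = StackingClassifier(estimators=base_learners,
                           final_estimator=LogisticRegression(),
                           stack_method="predict_proba", cv=5)
stack.fit(X_tr, y_tr)
acc = stack.score(X_te, y_te)
print(f"Stacked test accuracy: {acc:.3f}")
```

The key design point is that the final estimator is trained on cross-validated predictions rather than in-sample ones, which guards against the base models leaking their training fit into the meta-learner.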

The following are the topics that will be covered in this chapter:

  • Ensembles
  • Data understanding
  • Modeling and evaluation
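As a small preview of the multiclass setting covered in this chapter, the sketch below fits a one-vs-rest logistic regression to the three-class iris dataset (the dataset and models here are illustrative assumptions, not the chapter's own example).

```python
# Illustrative multiclass sketch; assumes scikit-learn is installed.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Iris has three outcome classes (setosa, versicolor, virginica).
X, y = load_iris(return_X_y=True)

# One-vs-rest decomposes the multiclass problem into one binary
# classifier per class; prediction takes the most confident class.
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, y)
print(list(clf.classes_))  # → [0, 1, 2]
```

One-vs-rest is only one of several strategies for extending binary learners to more than two outcomes; the modeling sections that follow look at this problem in detail.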