Predicting Employee Attrition Using Ensemble Models

If you review recent machine learning competitions, one key observation you are sure to make is that the recipes of the top winning entries in most competitions combine very good feature engineering with well-tuned ensemble models. One conclusion I draw from this observation is that good feature engineering and building well-performing models are two areas that deserve equal emphasis in order to deliver successful machine learning solutions.

While feature engineering usually depends on the creativity and domain expertise of the person building the model, building a well-performing model is something that can be achieved through a philosophy called ensembling. Machine learning practitioners often use ensembling techniques to beat the performance benchmarks set by even the best-performing individual ML algorithm. In this chapter, we will cover the following topics in this exciting area of ML:

  • Philosophy behind ensembling 
  • Understanding the attrition problem and the dataset
  • K-nearest neighbors model for benchmarking the performance
  • Bagging
  • Randomization with random forests
  • Boosting
  • Stacking
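Before diving into each technique, here is a minimal sketch of the core idea: an ensemble of weak learners can outperform a single instance of the same learner. The example below uses scikit-learn's `BaggingClassifier` around a k-nearest neighbors base model on a synthetic dataset (`make_classification` stands in as a placeholder; the chapter's actual attrition dataset is introduced later).

```python
# Minimal ensembling sketch: compare a single KNN model against a
# bagged ensemble of KNN models on synthetic classification data.
# The dataset here is an assumed stand-in, not the attrition data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import BaggingClassifier
from sklearn.metrics import accuracy_score

# Synthetic binary classification problem with a fixed seed
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=42)

# Single KNN model as a baseline
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)

# Bagging: 25 KNN models, each trained on a bootstrap sample,
# with predictions combined by majority vote
bag = BaggingClassifier(KNeighborsClassifier(n_neighbors=5),
                        n_estimators=25, random_state=42).fit(X_tr, y_tr)

knn_acc = accuracy_score(y_te, knn.predict(X_te))
bag_acc = accuracy_score(y_te, bag.predict(X_te))
print(f"single KNN accuracy: {knn_acc:.3f}")
print(f"bagged KNN accuracy: {bag_acc:.3f}")
```

The bagged ensemble typically matches or improves on the single model because averaging over bootstrap-trained learners reduces variance; the later sections on bagging, random forests, boosting, and stacking explore this mechanism in depth.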