Summary

In this chapter, we studied ensemble learning and its main methods, namely bagging, boosting, and stacking. We also saw what bootstrapping is, which is the foundation of ensemble methods such as bagging and boosting. We then learned about decision trees and their divide-and-rule approach, using the example of people applying for a loan. Next, we covered tree splitting and the parameters used to split a decision tree, before moving on to the random forest algorithm. We applied these concepts in a case study on breast cancer. We also examined the difference between bagging and boosting, introduced gradient boosting, and discussed the parameters of gradient boosting in order to apply it to the breast cancer example.
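As a quick recap of the chapter's workflow, the following is a minimal sketch of training a bagging-style ensemble (random forest) and a boosting-style ensemble (gradient boosting) on a breast cancer dataset. It assumes scikit-learn and its built-in load_breast_cancer data, which may differ from the exact dataset and parameter values used in the chapter's case study.

    # Minimal sketch: random forest and gradient boosting on breast cancer data
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Load the breast cancer data and hold out a test set.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    # Bagging-style ensemble: many decision trees fit on bootstrap samples.
    rf = RandomForestClassifier(n_estimators=100, random_state=42)
    rf.fit(X_train, y_train)

    # Boosting-style ensemble: trees added sequentially to correct earlier errors.
    gb = GradientBoostingClassifier(
        n_estimators=100, learning_rate=0.1, max_depth=3, random_state=42
    )
    gb.fit(X_train, y_train)

    print("Random forest accuracy:", accuracy_score(y_test, rf.predict(X_test)))
    print("Gradient boosting accuracy:", accuracy_score(y_test, gb.predict(X_test)))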

In the next chapter, we will learn about training neural networks.
