Bagging

Bagging stands for bootstrap aggregation, so the concept stems directly from bootstrapping. It is a bootstrap ensemble method in which multiple classifiers (typically from the same algorithm) are trained on samples drawn randomly with replacement (bootstrap samples) from the training set/population. The classifiers are then aggregated, either by averaging their outputs or by voting. Bagging helps reduce the effect of overfitting in the model.

There are three stages of bagging:

  • Bootstrapping: This is a statistical technique that's used to generate random samples or bootstrap samples with replacement.
  • Model fitting: In this stage, we build models on bootstrap samples. Typically, the same algorithm is used for building the models. However, there is no restriction on using different algorithms.
  • Combining models: This step involves combining all the models and taking an average. For example, if we have applied a decision tree classifier, then the predicted probability from each classifier is averaged.
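The three stages above can be sketched in a few lines of Python. This is a minimal illustration using scikit-learn decision trees and a synthetic dataset; the dataset, number of estimators, and variable names are illustrative choices, not from the text.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Illustrative toy dataset (not from the text)
X, y = make_classification(n_samples=500, random_state=0)

rng = np.random.default_rng(0)
n_estimators = 25
models = []

# Stages 1 and 2: draw bootstrap samples (with replacement)
# and fit one decision tree on each sample
for _ in range(n_estimators):
    idx = rng.integers(0, len(X), size=len(X))  # sampling with replacement
    models.append(DecisionTreeClassifier(random_state=0).fit(X[idx], y[idx]))

# Stage 3: combine the models by averaging their predicted probabilities
avg_proba = np.mean([m.predict_proba(X) for m in models], axis=0)
y_pred = avg_proba.argmax(axis=1)
acc = (y_pred == y).mean()
```

In practice, scikit-learn's `BaggingClassifier` wraps these same steps (bootstrap sampling, per-sample fitting, and probability averaging) behind a single estimator interface.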
