Depth-wise versus leaf-wise growth

LightGBM differs from XGBoost and CatBoost in how it prioritizes which nodes to split. LightGBM decides on splits leaf-wise, i.e., it splits the leaf node that maximizes the information gain, even when this leads to unbalanced trees. In contrast, XGBoost and CatBoost grow trees depth-wise (level-wise): they split every node at a given depth before adding another level. The two approaches expand nodes in a different order and will produce different results except for complete trees. The following diagram illustrates the two approaches:

LightGBM's leaf-wise splits tend to increase model complexity and may speed up convergence, but they also increase the risk of overfitting: a tree grown depth-wise to n levels has at most 2^n terminal nodes, whereas a leaf-wise tree with 2^n leaves can grow considerably deeper and leave correspondingly fewer samples in some leaves. Hence, tuning LightGBM's num_leaves setting requires extra caution, and the library lets us control max_depth at the same time to avoid undue node imbalance. XGBoost, in turn, can also grow trees leaf-wise via its grow_policy parameter.
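
The interplay between num_leaves and max_depth can be sketched as follows. This is a minimal, illustrative example: the synthetic data and the specific parameter values (num_leaves=31, max_depth=7, and the XGBoost settings) are assumptions for demonstration, not tuned recommendations.

```python
# Minimal sketch: constraining LightGBM's leaf-wise growth (illustrative values).
import lightgbm as lgb
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=42)

# num_leaves caps the number of terminal nodes per tree; max_depth additionally
# bounds how deep the leaf-wise splits may go, limiting node imbalance.
lgb_model = lgb.LGBMClassifier(
    n_estimators=100,
    num_leaves=31,   # default; comparable to a balanced tree of depth 5 (2^5 = 32 leaves)
    max_depth=7,     # -1 (unlimited) by default; a finite value curbs very deep, sparse branches
    learning_rate=0.1,
)
lgb_model.fit(X_train, y_train, eval_set=[(X_valid, y_valid)])

# For comparison: XGBoost grows trees depth-wise by default, but its histogram-based
# tree method supports leaf-wise (loss-guided) growth via grow_policy='lossguide'.
xgb_model = xgb.XGBClassifier(
    n_estimators=100,
    tree_method='hist',
    grow_policy='lossguide',
    max_leaves=31,
    max_depth=0,     # 0 removes the depth limit for loss-guided growth
)
xgb_model.fit(X_train, y_train)
```

With this setup, both models build trees with at most 31 leaves, but only the max_depth constraint keeps LightGBM's leaf-wise trees from becoming very deep and unbalanced.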
