Gradient-boosting machine

For illustration purposes, we also train a LightGBM gradient-boosting tree ensemble with default settings and the multiclass objective:

import lightgbm as lgb

param = {'objective': 'multiclass', 'num_class': 5}
booster = lgb.train(params=param,
                    train_set=lgb_train,
                    num_boost_round=500,
                    early_stopping_rounds=20,
                    valid_sets=[lgb_train, lgb_test])

The basic settings do not improve on multinomial logistic regression, but further parameter tuning remains an option for improvement:

y_pred_proba = booster.predict(test_dtm_numeric.astype(float))  # class probabilities
accuracy_score(test.stars, y_pred_proba.argmax(1) + 1)  # map column index back to 1-5 stars
0.738665855696524