One-versus-all logistic regression

We proceed to train a one-versus-all (OvR) logistic regression. This approach fits one binary model per class, treating all remaining classes as the negative class, and then predicts a probability for each class by combining the outputs of the individual models.
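To make the mechanics explicit, the following sketch (not part of the original code; it reuses the train_dtm, test_dtm, and stars variables from the earlier snippets) fits one binary logistic regression per star rating and assigns each test review to the class whose model produces the highest probability:

import numpy as np
from sklearn.linear_model import LogisticRegression

classes = sorted(train.stars.unique())
probas = []
for c in classes:
    # binary target: 1 for the current class, 0 for all other classes
    binary_model = LogisticRegression(C=1e9)
    binary_model.fit(train_dtm, (train.stars == c).astype(int))
    # keep the probability of the positive (current) class
    probas.append(binary_model.predict_proba(test_dtm)[:, 1])
# pick the class whose binary model is most confident
y_pred_manual = np.array(classes)[np.argmax(np.column_stack(probas), axis=1)]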

Using only text features, we train and evaluate the model as follows:

from sklearn.linear_model import LogisticRegression

# large C effectively disables regularization
logreg = LogisticRegression(C=1e9)
logreg.fit(X=train_dtm, y=train.stars)
y_pred_class = logreg.predict(test_dtm)

The model achieves a significantly higher accuracy of 73.6%:

from sklearn.metrics import accuracy_score

accuracy_score(test.stars, y_pred_class)
0.7360498864740219
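Since the one-versus-all setup produces a probability for each class, we can also inspect the full probability distribution over star ratings. The following sketch (an assumption, not shown above; it presumes test is a DataFrame as in the earlier snippets) uses predict_proba to collect these probabilities, one column per class:

import pandas as pd

# one column per star rating; each row sums to one
y_pred_prob = pd.DataFrame(logreg.predict_proba(test_dtm),
                           columns=logreg.classes_,
                           index=test.index)
y_pred_prob.head()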