Validation curves with yellowbrick

Validation curves (see the left-hand panel in the following graph) visualize the impact of a single hyperparameter on a model's cross-validation performance. This is useful to determine whether the model underfits or overfits the given dataset.

In our example of the KNeighborsRegressor, which has only a single hyperparameter, we can clearly see that the model underfits for values of k above 20: in this range, the validation error drops as we reduce the number of neighbors, which makes the model more complex because it can make predictions for more distinct groups of neighbors, or areas, in the feature space. For values below 20, as the validation curve shows, the model begins to overfit: training and validation errors diverge, and average out-of-sample performance quickly deteriorates.
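Such a plot can be produced with Yellowbrick's ValidationCurve visualizer. The following is a minimal sketch; the synthetic regression data, the range of k values, and the scoring metric are illustrative assumptions rather than the dataset used in this example:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.neighbors import KNeighborsRegressor
from yellowbrick.model_selection import ValidationCurve

# Illustrative synthetic data; substitute your own feature matrix and target.
X, y = make_regression(n_samples=500, n_features=10, noise=10, random_state=42)

viz = ValidationCurve(
    KNeighborsRegressor(),
    param_name="n_neighbors",
    param_range=np.arange(1, 51),       # candidate values of k (assumed range)
    cv=5,                               # 5-fold cross-validation
    scoring="neg_mean_squared_error",   # higher (less negative) score = lower error
)
viz.fit(X, y)   # fits the model for each value of k and each fold
viz.show()      # plots training vs. cross-validation score against k
```

Note that Yellowbrick plots scores rather than errors, so with neg_mean_squared_error the curve is mirrored relative to an error plot: overfitting appears where the training score keeps rising while the cross-validation score flattens or falls.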
