Overfitting and underfitting

Understanding overfitting and underfitting is key to building successful machine learning and deep learning models. At the start of the chapter, we briefly covered what underfitting and overfitting are; let's now look at them in detail and at how we can address them.

Overfitting, or failing to generalize, is a common problem in machine learning and deep learning. We say that an algorithm overfits when it performs well on the training dataset but fails to perform on unseen data, such as the validation and test datasets. This mostly occurs because the algorithm identifies patterns that are too specific to the training dataset. In simpler words, the algorithm finds a way to memorize the training dataset, so it performs really well on the training data but fails on unseen data. Several techniques can be used to prevent an algorithm from overfitting, including:

  • Getting more data
  • Reducing the size of the network
  • Applying a weight regularizer
  • Applying dropout 
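
To make the weight regularization idea concrete, here is a minimal, hypothetical sketch (not from the text) using plain NumPy: an L2 (ridge) penalty added to linear regression shrinks the learned weights, discouraging the model from fitting noise in the training data.

```python
import numpy as np

# Hypothetical toy data: only the first feature actually matters;
# the small noise term is what an overfit model would try to memorize.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
true_w = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
y = X @ true_w + rng.normal(scale=0.1, size=20)

def fit_linear(X, y, l2=0.0):
    # Closed-form solution of (X^T X + l2 * I) w = X^T y;
    # l2 > 0 penalizes large weights (ridge regression).
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + l2 * np.eye(n_features), X.T @ y)

w_plain = fit_linear(X, y)            # no regularization
w_ridge = fit_linear(X, y, l2=10.0)   # penalized: weights are shrunk

# The regularized solution has a strictly smaller weight norm.
print(np.linalg.norm(w_ridge) < np.linalg.norm(w_plain))
```

The same principle carries over to deep learning frameworks, where an L2 penalty on the weights is typically applied through the optimizer (often exposed as a weight-decay setting) rather than computed by hand.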