Loss Function Weighting

We can also handle classification on imbalanced data by adding class weights to the loss function. These weights penalize mistakes on the minority class (the class with fewer samples) more heavily, forcing the model to focus on it. Penalized-SVM and Focal Loss, discussed in previous chapters, are examples of this approach.

TensorFlow already provides loss functions with built-in weighting options:

  • tf.losses.sparse_softmax_cross_entropy(labels=label, logits=logits, weights=weights)
  • tf.nn.weighted_cross_entropy_with_logits

For example, if you are trying to classify three classes A, B, and C, where A makes up 10% of the data and B and C each make up 45%, you can use tf.losses.sparse_softmax_cross_entropy with the weights [1.0, 0.3, 0.3], so that errors on the rare class A count more than errors on B or C.
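The mechanics of per-class weighting can be sketched without TensorFlow: compute the usual cross-entropy for each example, then scale it by a weight looked up from that example's true class. The sketch below is a minimal NumPy illustration of this idea (the function name and sample data are made up for the example); it mirrors how the `weights` argument of `tf.losses.sparse_softmax_cross_entropy` rescales per-example losses.

```python
import numpy as np

def weighted_sparse_softmax_cross_entropy(labels, logits, class_weights):
    """Mean cross-entropy where each example's loss is scaled by the
    weight of its true class (a sketch of loss-function weighting)."""
    # Numerically stable log-softmax
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Negative log-likelihood of each example's true class
    nll = -log_probs[np.arange(len(labels)), labels]
    # Scale each example's loss by its class weight, then average
    return (nll * class_weights[labels]).mean()

# Three classes A, B, C with the weights from the text: rare class A
# gets weight 1.0, common classes B and C get 0.3.
class_weights = np.array([1.0, 0.3, 0.3])
labels = np.array([0, 1, 2, 1])          # true classes of 4 examples
logits = np.array([[2.0, 0.5, 0.1],      # toy model outputs
                   [0.2, 1.5, 0.3],
                   [0.1, 0.2, 2.2],
                   [1.0, 1.0, 1.0]])
loss = weighted_sparse_softmax_cross_entropy(labels, logits, class_weights)
```

With these weights, a misclassified class-A example contributes over three times as much to the gradient as an equally misclassified B or C example, which is what pushes the model to fit the minority class.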
