Problems and solutions for gradients in RNNs

RNNs are not perfect; they suffer from two main issues, namely exploding gradients and vanishing gradients. To understand these issues, let's first understand what a gradient is. A gradient is the partial derivative of a function's output with respect to its inputs. In layman's terms, a gradient measures how much the output of a function changes if one changes the inputs a little bit.
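A minimal sketch of why these two problems arise (the function name and the scalar-weight simplification are illustrative, not from the text): in backpropagation through time, the gradient reaching an early time step is roughly a product of one factor per step, so repeatedly multiplying by a factor larger than 1 makes the gradient explode, while a factor smaller than 1 makes it vanish.

```python
def backprop_factor(weight, steps):
    """Magnitude of a gradient propagated through `steps` identical
    time steps, each contributing a factor of |weight| (simplified
    scalar model of the product of Jacobians in BPTT)."""
    grad = 1.0
    for _ in range(steps):
        grad *= abs(weight)  # one multiplicative factor per time step
    return grad

# weight > 1: the gradient explodes over 100 steps
print(backprop_factor(1.1, 100))
# weight < 1: the gradient vanishes toward zero over 100 steps
print(backprop_factor(0.9, 100))
```

With a recurrent weight of 1.1 the propagated gradient grows by orders of magnitude over 100 steps, while 0.9 shrinks it almost to zero, which is exactly why long sequences make naive RNN training unstable.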
