Optimizer configuration

The next step is configuring an optimizer. We can use a gradient descent optimizer for our task; the Shark-ML library provides the SteepestDescent class for this purpose. It can be configured with the setLearningRate and setMomentum methods. After instantiation and configuration, the init method should be called with an object of the ErrorFunction type as its parameter, as shown in the sketch below.
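The following is a minimal sketch of this configuration step. It assumes that a train_data dataset and a model object (for example, a LinearModel) were prepared in the previous steps; the learning-rate and momentum values are illustrative, and the template argument syntax (for example, SteepestDescent<>) may differ slightly between Shark-ML versions:

#include <shark/Algorithms/GradientDescent/SteepestDescent.h>
#include <shark/ObjectiveFunctions/ErrorFunction.h>
#include <shark/ObjectiveFunctions/Loss/SquaredLoss.h>
#include <shark/Models/LinearModel.h>

using namespace shark;

// `train_data` (a labeled dataset) and `model` (e.g., a LinearModel<>) are
// assumed to have been created earlier in the example.
SquaredLoss<> loss;                          // loss used to build the error function
ErrorFunction<> error(train_data, &model, &loss);

SteepestDescent<> optimizer;                 // gradient descent optimizer
optimizer.setLearningRate(0.01);             // step size for each parameter update
optimizer.setMomentum(0.5);                  // momentum term to smooth the updates
optimizer.init(error);                       // initialize with the error function

After init has been called, the optimizer is ready for the training loop, where its step method is typically called repeatedly with the same error function object.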
