Building Meta-SGD from scratch

In the last section, we saw how Meta-SGD works: it learns not only a robust model parameter that generalizes across tasks, but also the optimal learning rate and update direction. Now, we'll deepen our understanding of Meta-SGD by coding it from scratch. As we did with MAML, for better understanding, we'll consider a simple binary classification task: we randomly generate our input data, train it with a simple single-layer neural network, and try to find the optimal parameter θ. We'll walk through exactly how to do this, step by step.
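As a quick reminder from the last section, Meta-SGD's adaptation step replaces MAML's fixed scalar learning rate with a learned vector α of the same shape as the parameters, applied elementwise:

θ' = θ − α ∘ ∇θ L(θ)

Because every parameter gets its own coefficient, α encodes both the per-parameter learning rate and the update direction.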

You can also check the code available as a Jupyter Notebook with an explanation here: https://github.com/sudharsan13296/Hands-On-Meta-Learning-With-Python/blob/master/07.%20Meta-SGD%20and%20Reptile%20Algorithms/7.4%20Building%20Meta-SGD%20from%20Scratch.ipynb.

First, we import the numpy library:

import numpy as np
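Before walking through each step, here is a minimal end-to-end sketch of what we'll build, assuming the same setup as our MAML example: tasks of randomly generated 50-dimensional points with binary labels, a single-layer sigmoid network trained with cross-entropy loss, and a first-order approximation for the meta-update. The names sample_points, theta, and alpha are illustrative placeholders for this sketch, not necessarily the exact code we'll write:

def sample_points(k):
    # Sample k random 50-dimensional inputs with random binary labels
    x = np.random.rand(k, 50)
    y = np.random.choice([0, 1], size=k).reshape(k, 1)
    return x, y

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

num_tasks = 10       # tasks sampled per meta-iteration
num_samples = 10     # points sampled per task
epochs = 1000        # meta-training iterations
beta = 0.0001        # meta (outer-loop) learning rate

# Meta-SGD learns both the initial parameter theta and a
# per-parameter learning rate alpha of the same shape as theta
theta = np.random.normal(size=(50, 1))
alpha = np.random.normal(size=(50, 1))

for epoch in range(epochs):
    meta_gradient = np.zeros_like(theta)

    for _ in range(num_tasks):
        # Inner loop: adapt theta on one task using the learned alpha
        x_train, y_train = sample_points(num_samples)
        y_hat = sigmoid(x_train.dot(theta))
        # Gradient of the cross-entropy loss with respect to theta
        gradient = x_train.T.dot(y_hat - y_train) / num_samples
        # The Meta-SGD step: alpha scales the update elementwise
        theta_prime = theta - alpha * gradient

        # Outer loop: evaluate the adapted parameters on new samples
        x_test, y_test = sample_points(num_samples)
        y_hat_test = sigmoid(x_test.dot(theta_prime))
        meta_gradient += x_test.T.dot(y_hat_test - y_test) / num_samples

    # First-order approximation of the meta-update for theta and alpha
    theta = theta - beta * meta_gradient / num_tasks
    alpha = alpha - beta * meta_gradient / num_tasks

The line theta_prime = theta - alpha * gradient is where Meta-SGD departs from MAML: alpha is a learned vector with the same shape as theta, so each parameter gets its own step size and sign, and both theta and alpha are refined in the meta-update.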