Building MAML from scratch

In the last section, we saw how MAML works: it finds a robust model parameter θ that generalizes well across tasks. Now we will deepen that understanding by coding MAML from scratch. To keep things simple, we will consider a binary classification task: we randomly generate input data, train a simple single-layer neural network on it, and try to find the optimal parameter θ. We will now see, step by step, exactly how to do this.

You can also check the code available as a Jupyter Notebook with an explanation here: https://github.com/sudharsan13296/Hands-On-Meta-Learning-With-Python/blob/master/06.%20MAML%20and%20it's%20Variants/6.5%20Building%20MAML%20From%20Scratch.ipynb.

First, we import the numpy library:

import numpy as np
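Before walking through the individual steps, here is a minimal end-to-end sketch of the idea described above: sample a batch of tasks, adapt θ to each task with one inner gradient step, then meta-update θ using the adapted parameters. This is only an illustrative sketch, not the book's exact code: the task sampler, sigmoid activation, cross-entropy gradient, learning rates, and dimensions are all assumptions, and the outer update uses the first-order approximation (second derivatives are ignored) for simplicity.

```python
import numpy as np

# Hypothetical task sampler: each task is a random binary classification
# problem with 10 samples of dimension 50 (sizes are illustrative).
def sample_task(num_samples=10, dim=50):
    x = np.random.normal(size=(num_samples, dim))
    y = np.random.choice([0, 1], size=(num_samples, 1)).astype(float)
    return x, y

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

num_tasks = 10             # tasks sampled per meta-iteration (assumed)
alpha, beta = 1e-4, 1e-4   # inner and outer learning rates (assumed)
theta = np.random.normal(size=(50, 1))  # single-layer network weights

for epoch in range(100):
    adapted = []
    # Inner loop: adapt theta to each task with one gradient step.
    for _ in range(num_tasks):
        x, y = sample_task()
        y_hat = sigmoid(x @ theta)
        grad = x.T @ (y_hat - y) / len(x)  # cross-entropy gradient
        adapted.append(theta - alpha * grad)
    # Outer loop: meta-gradient evaluated at the adapted parameters,
    # using fresh samples from each task (first-order approximation).
    meta_grad = np.zeros_like(theta)
    for phi in adapted:
        x, y = sample_task()
        y_hat = sigmoid(x @ phi)
        meta_grad += x.T @ (y_hat - y) / len(x)
    theta = theta - beta * meta_grad / num_tasks
```

The sections that follow build up each of these pieces in turn.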