Algorithm

Now we'll see how our algorithm works step by step:

  1. Let's say we have a task distribution $p(\mathcal{T})$. First, we randomly initialize our model parameters, such as the parameters of the concept generator $\theta_G$, the meta learner $\theta_M$, and the concept discriminator $\theta_D$.
  2. We sample a batch of tasks from the task distribution, learn their concepts via the concept generator, perform meta learning on those concepts, and then compute the meta learning loss, $\mathcal{L}_{meta}$.

  3. We sample some data points from our external dataset, feed them to the concept generator to learn their concepts, feed those concepts to the concept discriminator, which classifies them, and then compute the concept discrimination loss, $\mathcal{L}_{D}$.

  4. We combine both of these losses into a joint loss, $\mathcal{L} = \mathcal{L}_{meta} + \lambda \mathcal{L}_{D}$, where $\lambda$ balances the two terms, and minimize it using SGD to get the updated model parameters $\theta_G$, $\theta_M$, and $\theta_D$.
  5. Repeat steps 2 to 4 for n iterations; a minimal code sketch of this training loop is shown after this list.
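
The following is a minimal sketch of that joint training loop, assuming simple stand-in networks and synthetic data samplers. The module names (`concept_generator`, `meta_learner`, `concept_discriminator`), the dimensions, the sampling helpers, and the loss weight `lambda_d` are illustrative assumptions rather than the book's exact implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Step 1: randomly initialize the concept generator (theta_G),
# the meta learner (theta_M), and the concept discriminator (theta_D).
# These are simple stand-in networks for illustration only.
concept_generator = nn.Linear(784, 64)       # maps raw inputs to concepts
meta_learner = nn.Linear(64, 5)              # few-shot classifier over concepts
concept_discriminator = nn.Linear(64, 100)   # classifies concepts from the external dataset

params = (list(concept_generator.parameters())
          + list(meta_learner.parameters())
          + list(concept_discriminator.parameters()))
optimizer = torch.optim.SGD(params, lr=0.01)

lambda_d = 1.0  # assumed weight balancing the two losses

def sample_task_batch():
    """Placeholder for sampling a batch of tasks from p(T)."""
    x = torch.randn(25, 784)                 # examples of a sampled task
    y = torch.randint(0, 5, (25,))           # task labels
    return x, y

def sample_external_batch():
    """Placeholder for sampling labelled data from the external dataset."""
    x = torch.randn(32, 784)
    y = torch.randint(0, 100, (32,))
    return x, y

for iteration in range(1000):                # step 5: repeat steps 2 to 4 for n iterations
    # Step 2: learn concepts for the sampled tasks and compute the meta learning loss.
    task_x, task_y = sample_task_batch()
    task_concepts = concept_generator(task_x)
    meta_loss = F.cross_entropy(meta_learner(task_concepts), task_y)

    # Step 3: learn concepts for the external data and compute the concept discrimination loss.
    ext_x, ext_y = sample_external_batch()
    ext_concepts = concept_generator(ext_x)
    discrimination_loss = F.cross_entropy(concept_discriminator(ext_concepts), ext_y)

    # Step 4: combine both losses and update all parameters with SGD.
    loss = meta_loss + lambda_d * discrimination_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Note that the concept generator receives gradients from both losses, which is the point of the joint objective: the concepts it produces must be useful both for the meta learner on the sampled tasks and for the concept discriminator on the external dataset.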

Congratulations again on learning all of the important and popular meta learning algorithms. Meta learning is an interesting and promising field of AI that will take us closer to Artificial General Intelligence (AGI). Now that you've finished reading this book, you can start exploring various advancements in meta learning and experimenting with various projects. Learn and meta learn!
