In machine learning, hyperparameters are the settings you choose before an algorithm starts training. Internally, the algorithm learns parameters that help it perform a task. Externally, the programmer controls hyperparameters that dictate how the algorithm trains.
In the context of genetic algorithms, hyperparameters refer to things like population size, mutation rate, and so on, that you choose before running the algorithm.
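In Elixir, a natural way to represent a set of hyperparameters is a keyword list. The names below are illustrative; only population size is actually used later in this chapter:

```elixir
# A hypothetical set of hyperparameters for a genetic algorithm.
# :population_size is used later in this chapter; the others are
# examples of knobs you might expose.
hyperparameters = [
  population_size: 100,
  mutation_rate: 0.05,
  selection_rate: 0.8
]

# Keyword lists are plain lists of two-element tuples under the hood.
Keyword.get(hyperparameters, :mutation_rate)
```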
Because your hyperparameters can have a huge impact on the outcome of your algorithms, it’s important that you’re able to change them rapidly. To ensure you can change hyperparameters without too much of a headache, you need to implement a simple configuration mechanism in your framework that separates the hyperparameters from the overall structure of the algorithm.
To start, change the signature of both run/3 and evolve/3 to accept an additional parameter:
| def run(genotype, fitness_function, max_fitness, opts \\ []) do |
| # ...omitted... |
| end |
| def evolve(population, fitness_function, max_fitness, opts \\ []) do |
| # ...omitted... |
| end |
opts \\ [] indicates an optional parameter that defaults to an empty list if you pass nothing in its place. You can use opts to pass hyperparameters as a keyword list. A trailing opts parameter like this is a common idiom in Elixir programs.
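Because Elixir lets you omit the brackets on a trailing keyword list, callers can pass hyperparameters inline. The sketch below uses a stand-in module (OptsDemo is hypothetical, not part of the framework) to show how the trailing arguments arrive as opts:

```elixir
# Stand-in for the framework's run/4, showing how a trailing
# keyword list flows into the opts parameter.
defmodule OptsDemo do
  def run(_genotype, _fitness_function, _max_fitness, opts \\ []) do
    # opts is a plain keyword list; read a hyperparameter with a default.
    Keyword.get(opts, :population_size, 100)
  end
end

# Brackets on the trailing keyword list are optional at the call site:
OptsDemo.run(fn -> [] end, fn _ -> 0 end, 0, population_size: 50)
```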
After you add opts to the signatures of run and evolve, you need to edit the rest of your functions to accept it as well. Change the function signatures of all of the functions in genetic.ex to accept an optional opts parameter, like this:
| def initialize(genotype, opts \\ []) do |
| # ...omitted... |
| end |
| |
| def evaluate(population, fitness_function, opts \\ []) do |
| # ...omitted... |
| end |
| |
| def select(population, opts \\ []) do |
| # ...omitted... |
| end |
| |
| def crossover(population, opts \\ []) do |
| # ...omitted... |
| end |
| |
| def mutation(population, opts \\ []) do |
| # ...omitted... |
| end |
Finally, pass opts to every function in run and evolve:
| def run(genotype, fitness_function, max_fitness, opts \\ []) do |
| population = initialize(genotype, opts) |
| population |
| |> evolve(fitness_function, max_fitness, opts) |
| end |
| def evolve(population, fitness_function, max_fitness, opts \\ []) do |
| population = evaluate(population, fitness_function, opts) |
| best = hd(population) |
| IO.write(" Current Best: #{fitness_function.(best)}") |
| if fitness_function.(best) == max_fitness do |
| best |
| else |
| population |
| |> select(opts) |
| |> crossover(opts) |
| |> mutation(opts) |
| |> evolve(fitness_function, max_fitness, opts) |
| end |
| end |
For now, the only hyperparameter you’ll account for is population size. To do this, edit initialize/2 to look like this:
| def initialize(genotype, opts \\ []) do |
| population_size = Keyword.get(opts, :population_size, 100) |
| for _ <- 1..population_size, do: genotype.() |
| end |
Keyword.get/3 accepts a keyword list, a key, and a default value to return if there’s no value for the given key. Here you set the default population size to 100, which is a reasonable default for most genetic algorithms.
In later chapters, you’ll be introduced to more hyperparameters and learn to account for them so your algorithms are easily configurable.