Epochs and batch size

We'll choose 10 epochs for this example so that the model can be trained in under an hour. Note that 10 epochs will only get us to around 20% accuracy, so don't be alarmed if the resulting model does not appear accurate; you will need to train it for much longer, perhaps even around 1,000 epochs. On a modern computer, an epoch takes around three minutes to complete; for the sake of not requiring three days to complete this example, we've abbreviated the training process and will leave assessing the results of more epochs as an exercise. The flags are defined as shown here:

var (
	epochs     = flag.Int("epochs", 10, "Number of epochs to train for")
	dataset    = flag.String("dataset", "train", `Which dataset to train on? Valid options are "train" or "test"`)
	dtype      = flag.String("dtype", "float64", "Which dtype to use")
	batchsize  = flag.Int("batchsize", 100, "Batch size")
	cpuprofile = flag.String("cpuprofile", "", "CPU profiling")
)

Note that this model consumes a fairly large amount of memory; even a batchsize of 100 can require around 4 GB of RAM. If you don't have that much available without resorting to swap, you may want to lower the batch size so that the code performs better on your computer.
