Training the neural network

Our main function looks like this so far:

func main() {
	imgs, err := readImageFile(os.Open("train-images-idx3-ubyte"))
	if err != nil {
		log.Fatal(err)
	}
	labels, err := readLabelFile(os.Open("train-labels-idx1-ubyte"))
	if err != nil {
		log.Fatal(err)
	}

	log.Printf("len imgs %d", len(imgs))
	data := prepareX(imgs)
	lbl := prepareY(labels)
	visualize(data, 10, 10, "image.png")

	data2, err := zca(data)
	if err != nil {
		log.Fatal(err)
	}
	visualize(data2, 10, 10, "image2.png")

	nat, err := native.MatrixF64(data2.(*tensor.Dense))
	if err != nil {
		log.Fatal(err)
	}

	log.Printf("Start Training")
	nn := New(784, 100, 10)
	costs := make([]float64, 0, data2.Shape()[0])
	for e := 0; e < 5; e++ {
		data2Shape := data2.Shape()
		var oneimg, onelabel tensor.Tensor
		for i := 0; i < data2Shape[0]; i++ {
			if oneimg, err = data2.Slice(makeRS(i, i+1)); err != nil {
				log.Fatalf("Unable to slice one image %d", i)
			}
			if onelabel, err = lbl.Slice(makeRS(i, i+1)); err != nil {
				log.Fatalf("Unable to slice one label %d", i)
			}
			var cost float64
			if cost, err = nn.Train(oneimg, onelabel, 0.1); err != nil {
				log.Fatalf("Training error: %+v", err)
			}
			costs = append(costs, cost)
		}
		log.Printf("%d %v", e, avg(costs))
		shuffleX(nat)
		costs = costs[:0]
	}
	log.Printf("End training")
}

Here are the steps in brief:

  1. Load image files.
  2. Load label files.
  3. Convert image files into *tensor.Dense.
  4. Convert label files into *tensor.Dense.
  5. Visualize 100 of the images.
  6. Perform ZCA whitening on the images.
  7. Visualize the whitened images.
  8. Create a native iterator for the dataset.
  9. Create the neural network with 784 inputs, a 100-unit hidden layer, and 10 outputs.
  10. Create a slice of the costs. This is so we can keep track of the average cost over time.
  11. Within each epoch, slice the input into single image slices.
  12. Within each epoch, slice the output labels into single slices.
  13. Within each epoch, call nn.Train() with a learn rate of 0.1 and use the sliced single image and single labels as a training example.
  14. Train for five epochs.
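The main function leans on a few small helpers that have not been shown yet: makeRS, avg, and shuffleX. The sketches below are plausible minimal implementations, not the book's exact code; in particular, the tensor.Slice-style interface (Start, End, Step) and the shuffle are assumptions, and a more careful shuffle would keep the labels in step with the images.

```go
package main

import "math/rand"

// rs is a minimal range covering rows [start, end), in the shape
// that tensor.Tensor.Slice expects (Start, End, Step methods).
type rs struct{ start, end int }

func (r rs) Start() int { return r.start }
func (r rs) End() int   { return r.end }
func (r rs) Step() int  { return 1 }

// makeRS builds a range slice selecting rows start up to (but not
// including) end.
func makeRS(start, end int) rs { return rs{start, end} }

// avg returns the arithmetic mean of a slice of costs.
func avg(costs []float64) float64 {
	var sum float64
	for _, c := range costs {
		sum += c
	}
	return sum / float64(len(costs))
}

// shuffleX permutes the rows of a native [][]float64 matrix in place.
// Because the rows share backing memory with the original tensor, this
// shuffles the dataset without copying it.
func shuffleX(nat [][]float64) {
	rand.Shuffle(len(nat), func(i, j int) {
		nat[i], nat[j] = nat[j], nat[i]
	})
}
```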

How would we know that the neural network has learned well? One way is to monitor the costs. If the neural network is learning, the average cost will drop over time. There may be bumps along the way, of course, but the big picture should be a downward trend: the cost should not end up higher than it was when the program first started.
