From images to a matrix

So far, we've established that we can convert a list of images in a special format into a slice of slices of float64 ([][]float64). Recall from earlier that when you stack neurons together, they form a matrix, and that the activation of a neural layer is simply a matrix-vector multiplication. When the inputs are stacked together as well, it becomes a matrix-matrix multiplication.
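To make that concrete, here is a minimal sketch of a layer activation using plain slices. The function name and shapes are illustrative, and no bias or nonlinearity is applied:

```go
package main

import "fmt"

// matVecMul computes W·x, where W is an m×n weight matrix stored as a
// slice of row slices and x is an n-vector. Each output element is the
// dot product of one neuron's weights with the input.
func matVecMul(W [][]float64, x []float64) []float64 {
	out := make([]float64, len(W))
	for i, row := range W {
		for j, w := range row {
			out[i] += w * x[j]
		}
	}
	return out
}

func main() {
	// Two neurons, each with three weights: a 2×3 layer.
	W := [][]float64{
		{1, 0, 1},
		{0, 2, 0},
	}
	x := []float64{1, 2, 3}
	fmt.Println(matVecMul(W, x)) // [4 4]
}
```

Stacking several input vectors x as the columns of a matrix and repeating this per column is exactly the matrix-matrix multiplication described above.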

We could technically build a neural network with just [][]float64, but the end result would be quite slow. Collectively, as a species, we have had approximately 40 years of experience developing algorithms for efficient linear algebra operations, such as matrix multiplication and matrix-vector multiplication. This collection of algorithms is generally known as BLAS (Basic Linear Algebra Subprograms).

Up to this point in the book, we have been using libraries built on top of a library that provides BLAS functions, namely Gonum's BLAS library. If you have been following along, you will have it installed already. Otherwise, run go get -u gonum.org/v1/gonum/..., which installs the entire suite of Gonum libraries.

Because of the way BLAS works in general, we need a better way of representing matrices than [][]float64. Here we have two options:

  • Gonum's mat library
  • Gorgonia's tensor library
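Part of what makes both libraries BLAS-friendly is that they store a matrix in a single flat, row-major backing slice rather than a slice of slices, so the elements are contiguous in memory. A toy sketch of the idea (the type and method names here are illustrative, not either library's actual API):

```go
package main

import "fmt"

// dense is a toy row-major matrix: element (i, j) lives at data[i*cols+j].
// This is one contiguous allocation, instead of the rows+1 separate
// allocations a [][]float64 requires.
type dense struct {
	rows, cols int
	data       []float64
}

func (m *dense) at(i, j int) float64 { return m.data[i*m.cols+j] }

func (m *dense) set(i, j int, v float64) { m.data[i*m.cols+j] = v }

func main() {
	m := &dense{rows: 2, cols: 3, data: make([]float64, 2*3)}
	m.set(1, 2, 42)
	fmt.Println(m.at(1, 2)) // 42
}
```

Contiguous storage is what lets BLAS routines stride through a matrix efficiently, which a slice of separately allocated rows cannot guarantee.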

Why Gorgonia's tensor? The reason is quite simple: it plays well with Gorgonia itself, which requires multidimensional arrays. Gonum's mat handles at most two dimensions, while in the next chapter we'll see a use for four-dimensional arrays.
