What did this book not cover?

There is much more to explore in machine learning with Go. Here's a non-exhaustive list of topics you may want to look into:

  • Random trees and random forests
  • Support vector machines
  • Gradient-boosting methods
  • Maximum-entropy methods
  • Graphical methods
  • Local outlier factors

Perhaps if there is a second edition of this book, I will cover them. If you are familiar with machine learning methods, you may note that these, especially the first three, are among the highest-performing machine learning methods when compared with those covered in this book. You might wonder why they were not included. The schools of thought that these methods belong to supply a clue.

For example, random trees and random forests can be considered pseudo-Symbolist—they're a distant cousin of the Symbolist school of thought, originating from decision trees. Support vector machines are analogizers. Maximum entropy and graphical methods are of the Bayesian school of thought.

This book is biased toward the Connectionist school of thought for a good reason: deep learning is popular right now. Had the winds of favor blown differently, this book would have been markedly different. There is also the issue of explainability. I can explain support vector machines quite well, but the explanation would run to pages upon pages of mathematical analogy. Opting not to explain how SVMs work, on the other hand, would lead to a very thin chapter: the standard practice is to use libsvm or SVMlight, so one simply calls the functions provided by the library and the job's done. An explanation of SVMs is therefore warranted, but there was no room for one in this book.
