Choosing a hidden layer architecture

So now that we understand the cost and behavior of choosing too many parameters, and conversely too few, where do we start? To the best of my knowledge, all that's left is experimentation.

Measuring those experiments can be tricky. If your network trains quickly, like our early networks, then you can use something like cross-validation across a variety of architectures, evaluating multiple runs of each. If your network takes a long time to train, you might be left with something less statistically rigorous. We will cover network optimization in Chapter 6, Hyperparameter Optimization.
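
As a concrete illustration of that idea, here is a minimal sketch of cross-validating a few candidate architectures against one another. The dataset, the candidate hidden layer sizes, and the use of scikit-learn's MLPClassifier are assumptions made for the example; they are not part of the text, and in practice you would substitute your own model and data:

```python
# Sketch: comparing candidate hidden layer architectures with k-fold
# cross-validation. Dataset and candidates are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# A synthetic classification problem standing in for real data.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Candidate architectures: tuples of hidden layer widths.
candidates = [(32,), (64,), (32, 32), (128, 64)]

for hidden in candidates:
    model = MLPClassifier(hidden_layer_sizes=hidden, max_iter=500,
                          random_state=42)
    # 5-fold cross-validation gives a mean score and a spread for
    # each architecture, rather than a single noisy run.
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{hidden}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Reporting the standard deviation alongside the mean matters here: two architectures whose score intervals overlap heavily are not meaningfully distinguished by the experiment.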

Some books have offered a rule of thumb for choosing a neural network architecture. I remain skeptical of such claims, and you certainly won't find one here.