Visualizing a broken network

TensorBoard is a great troubleshooting tool. To demonstrate this, I'm going to copy our deep neural network and break it! Luckily, breaking a neural network is really easy. Trust me, I've done it unintentionally often enough that I'm basically an expert at this point.

Imagine that you have just trained a new neural network and seen that the loss looked like this:

The loss function for this network is stuck, and it's way higher than our previous run. What went wrong?

Navigate to the HISTOGRAMS section of TensorBoard and visualize the first hidden layer. Let's compare the histogram of the weights for hidden layer 1 in both networks: 

Screenshot displaying the histogram of the weights for hidden layer 1 in both networks

For both the biases and weights of the network labelled dnn, you'll see that the weights are spread out across the graph. You might even say that the distribution of each could be normal(ish).

You can also compare the weights and biases in the distributions section. Both present mostly the same information in slightly different ways.
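The contrast the two panels show can be sketched numerically. This is a hypothetical illustration (not the book's actual weights): a healthy layer's randomly initialized weights spread across many histogram bins, while a constant-initialized layer's weights collapse into a single spike.

```python
import numpy as np

rng = np.random.default_rng(42)
healthy = rng.normal(0.0, 0.1, size=500)  # random init: values are spread out
broken = np.full(500, 0.5)                # constant init: one value repeated

counts_h, _ = np.histogram(healthy, bins=20)
counts_b, _ = np.histogram(broken, bins=20)

# The healthy weights occupy many bins; the broken ones occupy exactly one,
# which is the tall single spike you see in TensorBoard for the broken run.
print((counts_h > 0).sum())  # many non-empty bins
print((counts_b > 0).sum())  # exactly 1 non-empty bin
```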

Now, look at the weights and biases of our broken network. Not so spread out, and in fact, the weights are all basically the same. The network isn't really learning. Every neuron in the layer appears to be more or less the same. If you look at the other hidden layers, you'll see more of the same.

You might be wondering what I did to make this happen. You're in luck; I'll share my secret. After all, you never know when you might need to break your own network. To break things, I initialized every neuron in the network to the exact same value. When this happens, the error every neuron receives during backprop is exactly the same and changes in exactly the same way. The network then fails to break symmetry. Initializing the weights of a deep neural network randomly is really important, and this is what happens if you break that rule!
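The symmetry failure can be seen directly in the gradients. Below is a minimal NumPy sketch (a hypothetical tiny network, not the book's actual model): with every weight set to the same constant, each hidden unit computes the same activation, receives an identical gradient during backprop, and therefore stays a clone of its neighbors forever.

```python
import numpy as np

# Tiny 2-layer network: 3 inputs -> 4 hidden units -> 1 output.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))   # a small batch of inputs
y = rng.normal(size=(8, 1))   # dummy targets

W1 = np.full((3, 4), 0.5)     # constant init: symmetry is never broken
b1 = np.zeros((1, 4))
W2 = np.full((4, 1), 0.5)
b2 = np.zeros((1, 1))

# Forward pass: tanh hidden layer, linear output.
h = np.tanh(X @ W1 + b1)
out = h @ W2 + b2

# Backward pass for mean squared error.
d_out = 2 * (out - y) / len(X)
dW2 = h.T @ d_out
dh = d_out @ W2.T
d_pre = dh * (1 - h ** 2)     # tanh derivative
dW1 = X.T @ d_pre

# Every column of dW1 (one column per hidden unit) is identical, and so is
# every row of dW2: each unit gets the same update, so the units never
# differentiate from one another, no matter how long you train.
print(np.allclose(dW1, dW1[:, :1]))  # True
print(np.allclose(dW2, dW2[0]))      # True
```

Swap the constant initializer for a random one (e.g. `rng.normal(size=(3, 4))`) and the gradient columns diverge immediately, which is exactly why random initialization matters.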

You can use TensorBoard exactly like this when you have a problem. Keep in mind our deep neural network has 4,033 parameters, and that still qualifies as tiny in the world of deep learning. With TensorBoard, we were able to visually inspect those 4,033 parameters and identify a problem. TensorBoard is an amazing flashlight in the dark room that is deep learning.
