Fully connected layer

This type of layer is exactly the same as any layer of a classical ANN with a fully connected (FC) architecture. In an FC layer, each neuron is connected to all the neurons of the previous layer, specifically to their activations.

This type of layer, unlike what has been seen so far in CNNs, does not use the property of local connectivity. An FC layer is connected to the entire input volume and, therefore, as you can imagine, there will be many connections. The only settable parameter of this type of layer is the number K of neurons that make it up. What basically defines an FC layer is the following: it connects its K neurons to the entire input volume and calculates the activation of each of those K neurons.

In fact, its output will be a single 1 x 1 x K vector containing the calculated activations. The fact that a single FC layer takes you from an input volume (organized in three dimensions) to a single output vector (in one dimension) suggests that, after applying an FC layer, no more convolutional layers can be used. The main function of FC layers in the context of CNNs is to carry out a sort of grouping of the information obtained up to that moment, expressing it with a single number (the activation of one of its neurons), which will be used in subsequent calculations for the final classification.
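The computation described above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation: the volume dimensions, the number of neurons K, and the choice of ReLU as the activation function are all example assumptions. The volume is flattened to one dimension, each of the K neurons receives a weighted connection to every input value, and the result is a vector of K activations (equivalent to a 1 x 1 x K volume):

```python
import numpy as np

rng = np.random.default_rng(0)

H, W, D = 4, 4, 8  # example input volume dimensions (assumed values)
K = 10             # number of neurons in the FC layer (assumed value)

volume = rng.standard_normal((H, W, D))

# Each of the K neurons connects to every value in the volume,
# so the weight matrix has H*W*D rows and K columns.
weights = rng.standard_normal((H * W * D, K))
biases = np.zeros(K)

x = volume.reshape(-1)            # flatten the 3D volume to 1 dimension
z = x @ weights + biases          # one pre-activation per neuron
activations = np.maximum(z, 0.0)  # e.g. a ReLU activation

print(activations.shape)  # (10,), i.e. a 1 x 1 x K vector
```

Note the cost of giving up local connectivity: this single layer already has H*W*D*K + K = 1,290 trainable parameters for a tiny 4 x 4 x 8 volume, which is why FC layers typically appear only at the end of a CNN, after pooling has reduced the volume.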
