Implementing forward propagation

We can compute the softmax layer's outputs using cudnnSoftmaxForward() from the cuDNN library:

cudnnSoftmaxForward(cudnnHandle, CUDNN_SOFTMAX_ACCURATE,
                    CUDNN_SOFTMAX_MODE_CHANNEL,
                    &one, input_desc, d_input,
                    &zero, output_desc, d_output);

One of the most important parameter settings here is CUDNN_SOFTMAX_MODE_CHANNEL. This mode computes the softmax across the channel dimension of the input tensor, as described by its tensor descriptor, independently for each position in the mini-batch. This lets us pass the dense layer's mini-batch outputs as channel-aligned tensors. The &one and &zero arguments are the standard cuDNN alpha/beta scaling factors, which here simply write the softmax result over the output buffer without blending.
