Convolution shift

The next step is called convolution shift. It is used for moving the head position, that is, for shifting the focus from one location to another. Each head emits a parameter called a shift weight, s_t, which gives us a distribution over the allowable integer shifts. For example, if shifts between -1 and +1 are allowed, then the shift weight vector has a length of three, comprising the shifts {-1, 0, +1}.

So, what exactly do these shifts mean? Let's say we have three elements in our weight vector, w_t, that is, w_t = [w_t(0), w_t(1), w_t(2)], and we have three elements in the shift weight vector, that is, s_t = [s(-1), s(0), s(+1)].

A shift of -1 means that we will shift each element in w_t one position to the left. A shift of 0 keeps each element at the same position, and a shift of +1 means that we will shift each element one position to the right. This can be seen in the following diagram:

Now, look at the following diagram, where we have shift weights s_t = [1, 0, 0]. All the shift weight sits on the -1 position and the shift values are 0 at the other positions, so we perform a left shift:

Similarly, when s_t = [0, 0, 1], all the shift weight sits on the +1 position and the shift values are 0 at the other positions, so we perform a right shift, as shown in the following diagram:
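The two shifts described above can be checked numerically. The following is a small sketch (the weight values are mine, chosen for illustration), using NumPy's np.roll to perform the circular shift:

```python
import numpy as np

w = np.array([0.1, 0.7, 0.2])  # attention weights over three memory locations

# s_t = [1, 0, 0]: all the shift weight sits on shift -1, so every
# element moves one position to the left (circularly)
left = np.roll(w, -1)
print(left)   # [0.7 0.2 0.1]

# s_t = [0, 0, 1]: all the shift weight sits on shift +1, so every
# element moves one position to the right (circularly)
right = np.roll(w, 1)
print(right)  # [0.2 0.1 0.7]
```

Note that the shift wraps around: in the left shift, the element at the first position moves to the last position, which is what makes the shift circular.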

So, in this way, we perform convolution shifts over the elements in the weight vector. If we have 0 to N-1 memory locations, then we can express our convolution shift as follows:

\tilde{w}_t(i) = \sum_{j=0}^{N-1} w_t(j) \, s_t(i-j)

Here, the index i - j is taken modulo N, so the shift wraps around the memory locations circularly.

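The circular convolution above can be sketched in NumPy as follows (the function and variable names are mine, not from the text); np.roll supplies the modulo-N wraparound:

```python
import numpy as np

def convolution_shift(w, s, shifts=(-1, 0, 1)):
    """Blend circularly shifted copies of w using the shift distribution s.

    w      : weight vector over the N memory locations
    s      : shift weight vector, one entry per allowed shift
    shifts : the allowed integer shifts (here -1, 0, +1)
    """
    w_tilde = np.zeros_like(w, dtype=float)
    for k, s_k in zip(shifts, s):
        # np.roll(w, k)[i] == w[(i - k) mod N], the modulo-N wraparound
        w_tilde += s_k * np.roll(w, k)
    return w_tilde

w = np.array([0.1, 0.7, 0.2])
print(convolution_shift(w, [0.9, 0.1, 0.0]))  # mostly a left shift of w
</imports>ce the shift weights s sum to one, the result is still a valid attention distribution: a soft blend of the left-shifted, unshifted, and right-shifted copies of w.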