RNN basic concepts

Human beings don't start thinking from scratch; the human mind has a so-called persistence of memory, namely the ability to associate past information with recent information. Traditional neural networks, by contrast, ignore past events. Take a classifier of movie scenes as an example: a traditional neural network cannot use previously seen scenes to classify the current one.

RNNs were developed to address this problem. In contrast with Convolutional Neural Networks (CNNs), an RNN contains a loop that allows information to persist from one step to the next.

RNNs process a sequential input one element at a time, updating a kind of state vector that contains information about all the past elements of the sequence.
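In its most general form, this update can be written as a recurrence in which the current state depends on the previous state and the current input, while the output is read from the current state. The following is only a generic sketch, with f and g standing for unspecified (learned) transformations:

\[
S_t = f(S_{t-1}, X_t), \qquad O_t = g(S_t)
\]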

The following figure shows a neural network that takes a value Xt as input and produces an output value Ot:

An RNN with its internal loop

St is the network's state vector, which can be considered a kind of memory of the system: it contains information about all the previous elements of the input sequence. The loop depicted in the figure is what allows this information to be passed from one step of the network to the next.
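To make this description concrete, here is a minimal NumPy sketch of a single recurrent step and of how the state St is carried across a sequence. The weight names U, W, and V and the tanh activation are illustrative assumptions (a generic vanilla RNN cell), not a specific implementation from this book:

import numpy as np

input_size, state_size, output_size = 4, 8, 3

rng = np.random.default_rng(0)
U = rng.standard_normal((state_size, input_size)) * 0.1   # input-to-state weights
W = rng.standard_normal((state_size, state_size)) * 0.1   # state-to-state (loop) weights
V = rng.standard_normal((output_size, state_size)) * 0.1   # state-to-output weights

def rnn_step(x_t, s_prev):
    """Compute the new state St from the input Xt and the previous state St-1."""
    s_t = np.tanh(U @ x_t + W @ s_prev)   # St depends on both Xt and St-1
    o_t = V @ s_t                         # the output Ot is read from the state
    return s_t, o_t

# Process a toy sequence one element at a time, carrying the state along.
sequence = [rng.standard_normal(input_size) for _ in range(5)]
s_t = np.zeros(state_size)                # initial state S0
for x_t in sequence:
    s_t, o_t = rnn_step(x_t, s_t)

print("final state:", s_t)
print("last output:", o_t)

In a real RNN these matrices (plus bias terms) are learned by backpropagation through time; the sketch only illustrates how the loop carries information forward through the sequence.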
