Recurrent dropout

Having read this far into the book, you've already encountered the concept of dropout. Dropout randomly removes some elements of a layer's input. A common and important tool in RNNs is recurrent dropout, which does not drop the inputs between layers but instead drops the recurrent connections between time steps:

Recurrent dropout scheme
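To make the difference concrete, here is a minimal pure-Python sketch (an illustration of the idea, not the actual Keras internals): recurrent dropout samples one binary mask and reuses it at every time step, whereas regular dropout would sample a fresh mask at each step. The function name and parameters are hypothetical, chosen only for this illustration.

```python
import random

def recurrent_dropout_masks(n_units, rate, n_steps, seed=0):
    """Illustrative sketch only: contrast a single mask reused across
    time steps (recurrent dropout) with a fresh mask per step."""
    rng = random.Random(seed)
    # Recurrent dropout: sample ONE mask, reuse it at every time step.
    shared = [0.0 if rng.random() < rate else 1.0 for _ in range(n_units)]
    recurrent = [list(shared) for _ in range(n_steps)]
    # Regular dropout: sample a fresh mask at every time step.
    regular = [[0.0 if rng.random() < rate else 1.0 for _ in range(n_units)]
               for _ in range(n_steps)]
    return recurrent, regular

recurrent, regular = recurrent_dropout_masks(n_units=8, rate=0.25, n_steps=5)
# Every time step of the recurrent scheme uses the identical mask:
assert all(step == recurrent[0] for step in recurrent)
```

Reusing the same mask across time steps is what gives recurrent dropout its regularizing effect without destroying the information carried through the recurrent state from one step to the next.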

Just as with regular dropout, recurrent dropout has a regularizing effect and can prevent overfitting. In Keras, it is enabled simply by passing the recurrent_dropout argument to an LSTM or other RNN layer.

As the following code shows, recurrent dropout, unlike regular dropout, does not have its own layer:

model = Sequential()
model.add(LSTM(16, recurrent_dropout=0.1, return_sequences=True,
               input_shape=(max_len, n_features)))
model.add(LSTM(16, recurrent_dropout=0.1))
model.add(Dense(1))