Recurrent neural networks for sequences

Another type of specialized deep neural network, the recurrent neural network, is particularly suited to sequential data. A widely used variant is the long short-term memory network (LSTM for short) (Hochreiter and Schmidhuber, 1997). LSTM networks are built from LSTM cells, each of which receives an input vector and produces an output vector. The LSTM cell is intricate: several "gates" (input gates, output gates, and forget gates) regulate how information flows into, out of, and through the cell. These gates are controlled jointly by the current input and by the cell's output from the previous time step. LSTMs have been the networks behind many of the successful results achieved in handwriting recognition, speech recognition, language translation, and image captioning (Goodfellow et al., 2016).
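The gating mechanism described above can be sketched concretely. The following is a minimal, illustrative single LSTM time step in NumPy, not any production implementation: the weight shapes, random initialization, and the `lstm_step` helper are assumptions made for the example. The four gates are computed from the current input and the previous hidden state, and the forget and input gates decide how much of the old cell state to keep and how much new information to write.

```python
import numpy as np

def sigmoid(z):
    # Squashes gate pre-activations into (0, 1), so each gate acts as a soft switch.
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, params):
    """One LSTM time step (a hypothetical helper for illustration).

    x      : current input vector
    h_prev : hidden (output) vector from the previous time step
    c_prev : cell state from the previous time step
    """
    W, U, b = params["W"], params["U"], params["b"]  # stacked weights for all 4 gates
    z = W @ x + U @ h_prev + b                       # gates depend on x and h_prev
    H = h_prev.shape[0]
    i = sigmoid(z[0:H])          # input gate: how much new information to write
    f = sigmoid(z[H:2 * H])      # forget gate: how much old cell state to keep
    o = sigmoid(z[2 * H:3 * H])  # output gate: how much of the cell to expose
    g = np.tanh(z[3 * H:4 * H])  # candidate cell update
    c = f * c_prev + i * g       # new cell state
    h = o * np.tanh(c)           # new hidden (output) vector
    return h, c

# Run a short 5-step sequence through one cell with random weights (assumed sizes).
rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 4
params = {
    "W": rng.standard_normal((4 * hidden_dim, input_dim)) * 0.1,
    "U": rng.standard_normal((4 * hidden_dim, hidden_dim)) * 0.1,
    "b": np.zeros(4 * hidden_dim),
}
h = np.zeros(hidden_dim)
c = np.zeros(hidden_dim)
for x in rng.standard_normal((5, input_dim)):
    h, c = lstm_step(x, h, c, params)
```

Because the hidden vector is the output gate times a tanh of the cell state, each of its components stays strictly between -1 and 1, while the cell state itself can grow over time and carry longer-range information.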

A recent study, again from Google, Inc., used deep learning architectures, including an LSTM architecture, on electronic health record data to predict in-hospital mortality, unplanned readmission, prolonged length of stay, and final discharge diagnoses (Rajkomar et al., 2018), achieving state-of-the-art results on these tasks.
