Dropout

A neural network can be thought of as a search problem: each node in the network is searching for a correlation between the input data and the correct output.

Dropout randomly turns nodes off during forward propagation, which discourages different weights from converging to identical values. After the forward pass, all nodes are turned back on for back-propagation. In practice, applying dropout to a layer means setting a random subset of that layer's activation values to zero during forward propagation.

Use dropout only during training. Do not apply it at inference time or when evaluating on your test set.
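The mechanism above can be sketched in a few lines of NumPy. This is a minimal illustration, not a library API: the function name, the `keep_prob` value, and the use of inverted dropout (rescaling the surviving activations, a common variant) are assumptions for the example.

```python
import numpy as np

def forward_with_dropout(activations, keep_prob=0.5, training=True):
    """Apply (inverted) dropout to a layer's activations.

    Illustrative sketch: names and keep_prob are assumptions.
    """
    if not training:
        # Per the rule above: dropout is disabled at inference time.
        return activations
    # Each node survives independently with probability keep_prob;
    # the rest are set to zero for this forward pass.
    mask = (np.random.rand(*activations.shape) < keep_prob)
    mask = mask.astype(activations.dtype)
    # Inverted-dropout scaling keeps the expected activation
    # magnitude the same between training and inference.
    return activations * mask / keep_prob
```

Calling the function with `training=False` returns the activations unchanged, which is why no extra rescaling step is needed at test time in this variant.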
