Activation functions

The activation ops provide different types of nonlinearities for use in neural networks. These include smooth nonlinearities such as sigmoid, tanh, elu, softplus, and softsign, as well as continuous but not-everywhere-differentiable functions such as relu, relu6, crelu, and relu_x. All activation ops apply element-wise and produce a tensor of the same shape as the input tensor. Now let us see how to use a few commonly used activation functions in TensorFlow syntax.
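As an illustration, the following is a minimal sketch of applying a few of these ops to a sample tensor, assuming TensorFlow 2.x with eager execution; the input values are arbitrary and chosen only to show the element-wise behavior.

import tensorflow as tf

# Sample input tensor; any shape works because the activations apply element-wise.
x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])

# Smooth nonlinearities
print(tf.nn.sigmoid(x))   # squashes values into (0, 1)
print(tf.nn.tanh(x))      # squashes values into (-1, 1)
print(tf.nn.elu(x))       # exponential linear unit
print(tf.nn.softplus(x))  # smooth approximation of relu: log(1 + exp(x))
print(tf.nn.softsign(x))  # x / (|x| + 1)

# Continuous but not everywhere differentiable
print(tf.nn.relu(x))      # max(x, 0)
print(tf.nn.relu6(x))     # min(max(x, 0), 6)
print(tf.nn.crelu(x))     # concatenates relu(x) and relu(-x) along the last axis

Each call returns a tensor of the same shape as x, with the exception of crelu, whose output doubles the size of the last dimension because it concatenates the activations of the positive and negative parts.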
