ReLU

The rectified linear unit (ReLU) takes a real-valued input and thresholds it at zero, replacing every negative value with zero:

f(x) = max(0, x)
   
Figure 6: ReLU activation function
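As a minimal sketch, ReLU can be implemented element-wise with NumPy (the function name relu is ours, not from the text):

import numpy as np

def relu(x):
    # Element-wise ReLU: negative entries are replaced with zero.
    return np.maximum(0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]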

Importance of bias: The main function of bias is to provide every node with a trainable constant value, in addition to the normal inputs the node receives. See https://stackoverflow.com/questions/2480650/role-of-bias-in-neural-networks to learn more about the role of bias in a neuron. A small sketch of this idea follows.
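The sketch below (our own illustration, with hypothetical names x, w, and b) shows a single node computing a weighted sum plus a bias, passed through ReLU. Shifting the bias shifts the point at which the node begins to fire:

import numpy as np

def node_output(x, w, b):
    # A single node: weighted sum of inputs plus a trainable bias,
    # passed through the ReLU activation.
    return np.maximum(0, np.dot(w, x) + b)

x = np.array([0.5, -1.0])
w = np.array([2.0, 1.0])
# Here the weighted sum is exactly 0, so without a bias ReLU outputs 0;
# a positive bias shifts the threshold and lets the node fire.
print(node_output(x, w, b=0.0))  # 0.0
print(node_output(x, w, b=0.5))  # 0.5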
