Using ReLU

In TensorFlow, the signature tf.nn.relu(features, name=None) computes the rectified linear activation max(features, 0) and returns a tensor of the same type as features (see the short example after the parameter list). Here is the parameter description:

  • features: A tensor. It must be one of the following types: float32, float64, int32, int64, uint8, int16, int8, uint16, or half.
  • name: A name for the operation (optional).
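The following is a minimal sketch of calling tf.nn.relu on a small tensor; the input values and the operation name "relu_example" are illustrative, and the snippet assumes TensorFlow 2.x eager execution (in TensorFlow 1.x the result would be evaluated inside a session instead).

import tensorflow as tf

# A small tensor mixing negative and positive values (illustrative data).
features = tf.constant([-2.0, -0.5, 0.0, 1.5, 3.0], dtype=tf.float32)

# tf.nn.relu replaces every negative entry with 0 and keeps positive entries unchanged.
activated = tf.nn.relu(features, name="relu_example")

print(activated)  # expected: [0.  0.  0.  1.5 3. ]

Because ReLU is applied element-wise, the output tensor has the same shape and dtype as the input.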

For more on how to use other activation functions, please refer to the TensorFlow website. Up to this point, we have the minimal theoretical knowledge needed to build our first CNN and make a prediction.
