In the backpropagation algorithm recipe, we defined the layers, weights, loss, and gradients manually, and updated the weights through those gradients ourselves. Working through the equations by hand is a good way to build understanding, but it quickly becomes cumbersome as the number of layers in the network increases.
In this recipe, we will use higher-level TensorFlow features: the contrib (layers) module to define neural network layers, and TensorFlow's own optimizers to compute and apply the gradients. We saw in Chapter 2, Regression, how to use the different TensorFlow optimizers. The contrib module provides ready-made building blocks for adding various layers to a neural network model. The method we use here is tf.contrib.layers.fully_connected, defined in the TensorFlow documentation as follows:
fully_connected(
    inputs,
    num_outputs,
    activation_fn=tf.nn.relu,
    normalizer_fn=None,
    normalizer_params=None,
    weights_initializer=initializers.xavier_initializer(),
    weights_regularizer=None,
    biases_initializer=tf.zeros_initializer(),
    biases_regularizer=None,
    reuse=None,
    variables_collections=None,
    outputs_collections=None,
    trainable=True,
    scope=None
)
This adds a fully connected layer: it creates the weights and biases variables for us and returns activation_fn(inputs * weights + biases), defaulting to ReLU activation, Xavier weight initialization, and zero-initialized biases.
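To make the computation concrete, here is a minimal pure-Python sketch of what a fully connected layer does under the hood: a matrix multiplication, a bias addition, and the default ReLU activation. The function and variable names, and the weight values, are illustrative only (real layers use Xavier-initialized weights and operate on batched tensors).

```python
def relu(values):
    # ReLU activation: clamp each pre-activation at zero.
    return [max(0.0, v) for v in values]

def fully_connected_sketch(inputs, weights, biases, activation_fn=relu):
    # Pre-activation: z[j] = sum_i inputs[i] * weights[i][j] + biases[j]
    num_outputs = len(biases)
    z = [
        sum(inputs[i] * weights[i][j] for i in range(len(inputs))) + biases[j]
        for j in range(num_outputs)
    ]
    # Apply the activation function (ReLU by default, as in fully_connected).
    return activation_fn(z)

# Illustrative example: 2 inputs mapped to 3 outputs.
x = [1.0, -2.0]
W = [[0.5, -1.0, 2.0],   # weights from input 0 to each of the 3 outputs
     [1.0,  0.5, 0.0]]   # weights from input 1 to each of the 3 outputs
b = [0.0, 0.1, -0.5]
print(fully_connected_sketch(x, W, b))  # → [0.0, 0.0, 1.5]
```

With tf.contrib.layers.fully_connected, all of this (plus creating and tracking the weight and bias variables) is handled by a single call such as fully_connected(inputs, num_outputs=3).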