Fully connected layer

At the top of the stack, a regular fully connected layer (also known as an FNN or dense layer) is added; it acts like a small MLP, typically composed of a few fully connected layers with ReLU activations. The final layer produces the prediction: for example, a softmax layer that outputs estimated class probabilities for multiclass classification.
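As a minimal sketch of such a dense head (assuming the Keras API from TensorFlow; the feature-map shape, 128 hidden units, and 10 classes are illustrative assumptions, not values from the source):

    import tensorflow as tf
    from tensorflow.keras import layers, models

    # Hypothetical dense "head" placed on top of a convolutional stack.
    model = models.Sequential([
        layers.Flatten(input_shape=(7, 7, 64)),   # flatten the last feature maps
        layers.Dense(128, activation="relu"),     # fully connected layer + ReLU
        layers.Dense(10, activation="softmax"),   # softmax: estimated class probabilities
    ])

    model.summary()

The Flatten layer simply unrolls the convolutional feature maps into a vector so the dense layers can consume them.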

Fully connected layers connect every neuron in one layer to every neuron in the next. Although fully connected feedforward networks can be used to learn features as well as to classify data, this architecture is impractical for images: because every pixel is connected to every neuron, the number of parameters explodes even for modestly sized inputs.
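A quick back-of-the-envelope calculation makes this concrete (the image and layer sizes here are illustrative assumptions):

    # Hypothetical sizes: a 224x224 RGB image fully connected
    # to a single hidden layer of 1,000 neurons.
    inputs = 224 * 224 * 3        # 150,528 input values
    hidden = 1_000
    weights = inputs * hidden     # one weight per input-neuron pair
    print(f"{weights:,} weights") # 150,528,000 weights in a single layer

Over 150 million weights in one layer is why convolutional layers, with their local connectivity and shared weights, are used to learn features from images instead.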
