Summary

In this chapter, we've seen how perceptrons can be applied to linearly separable problems and discussed their limitations when classifying nonlinear data. To overcome these limitations, we introduced multilayer perceptrons (MLPs) and a new training algorithm called backpropagation. We've also seen some classes of problems that MLPs can be applied to, such as classification and regression. It's important to absorb these concepts, since they underpin the approaches presented in subsequent chapters. The Java implementation demonstrated the power of the backpropagation algorithm in updating the weights of both the output layer and the hidden layer, and a practical application showed MLPs solving the problems considered.
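As a refresher on the core idea summarized above, the following is a minimal sketch (not the book's implementation; the network size, initial weights, and learning rate are assumptions for illustration) of a single backpropagation step for a tiny 2-2-1 MLP with sigmoid activations, showing how the error computed at the output is used to update both the output-layer and the hidden-layer weights:

```java
// Minimal backpropagation sketch for a 2-2-1 MLP with sigmoid units.
// All numeric values are illustrative assumptions, not the book's code.
public class BackpropSketch {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    double[][] wHidden = {{0.15, 0.20}, {0.25, 0.30}}; // input -> hidden weights
    double[] wOutput = {0.40, 0.45};                   // hidden -> output weights
    double learningRate = 0.5;

    // One forward pass plus one weight update; returns the squared error
    // measured before the update, so progress can be tracked.
    double trainStep(double[] input, double target) {
        // Forward pass: hidden-layer activations, then the single output.
        double[] hidden = new double[2];
        for (int j = 0; j < 2; j++) {
            double sum = 0;
            for (int i = 0; i < 2; i++) sum += wHidden[j][i] * input[i];
            hidden[j] = sigmoid(sum);
        }
        double outSum = 0;
        for (int j = 0; j < 2; j++) outSum += wOutput[j] * hidden[j];
        double out = sigmoid(outSum);
        double error = target - out;

        // Output-layer delta: error times the sigmoid derivative.
        double deltaOut = error * out * (1 - out);
        // Hidden-layer deltas: the output delta propagated backward
        // through the (old) output weights.
        double[] deltaHidden = new double[2];
        for (int j = 0; j < 2; j++)
            deltaHidden[j] = deltaOut * wOutput[j] * hidden[j] * (1 - hidden[j]);

        // Update output-layer weights, then hidden-layer weights.
        for (int j = 0; j < 2; j++) wOutput[j] += learningRate * deltaOut * hidden[j];
        for (int j = 0; j < 2; j++)
            for (int i = 0; i < 2; i++)
                wHidden[j][i] += learningRate * deltaHidden[j] * input[i];

        return error * error;
    }

    public static void main(String[] args) {
        BackpropSketch net = new BackpropSketch();
        double[] x = {0.05, 0.10};
        double first = net.trainStep(x, 0.99);
        double last = first;
        for (int epoch = 0; epoch < 500; epoch++) last = net.trainStep(x, 0.99);
        System.out.println("squared error before: " + first + ", after: " + last);
    }
}
```

Repeating the step drives the squared error down, which is the essential behavior of the gradient-descent weight updates discussed in the chapter.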

In the next chapter, we will explore the other learning paradigm of neural networks, unsupervised learning, which differs from the algorithms we've seen in this chapter; however, it can produce amazing results.
