Chapter 9. Neural Network Optimization and Adaptation

In this chapter, the reader will be presented with techniques that help to optimize neural networks so that they perform at their best. Tasks such as input selection, dataset separation and filtering, and choosing the number of hidden neurons are examples of adjustments that can improve a neural network's performance. Furthermore, this chapter focuses on methods for adapting neural networks to real-time data. Two implementations of these techniques are presented here, and application problems are selected for the exercises. This chapter deals with the following:

  • Input selection
    • Dimensionality reduction
    • Data filtering
  • Structure selection
    • Pruning
  • Online retraining
    • Stochastic online learning
  • Adaptive neural networks
    • Adaptive resonance theory

Common issues in neural network implementations

When developing a neural network application, it is quite common to face problems regarding how accurate the results are. These problems can have various sources:

  • bad input selection
  • noisy data
  • an overly large dataset
  • an unsuitable structure
  • an inadequate number of hidden neurons
  • an inadequate learning rate
  • an insufficient stop condition
  • bad dataset segmentation
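Two of these sources, noisy data and bad dataset segmentation, can be illustrated with a minimal sketch. The function names, the 70/30 split, and min-max normalization are illustrative choices, not prescriptions from this chapter:

```python
import random

def split_dataset(samples, train_fraction=0.7, seed=42):
    """Shuffle and split samples into training and test subsets.

    Shuffling before splitting avoids a biased segmentation where,
    for example, all samples of one class end up in the test set.
    The 0.7 fraction is an illustrative choice.
    """
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

def min_max_normalize(values):
    """Rescale a list of numbers to the [0, 1] range, a simple form
    of data filtering that keeps inputs on comparable scales."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]
```

In practice, the split would be applied to input/target pairs and the normalization parameters computed on the training set only, then reused on the test set.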

The design of a neural network application sometimes requires a lot of patience and trial and error. There is no methodology that states specifically how many hidden units or which architecture should be used, but there are recommendations on how to choose these parameters properly. Another issue that programmers may face is a long training time that still fails to produce a useful network: no matter how long the training runs, the neural network does not converge.

Tip

Designing a neural network requires the programmer or designer to test and redesign the neural structure as many times as needed, until an acceptable result is obtained.
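This test-and-redesign cycle can itself be sketched as a simple search over candidate structures. The helper below is a hypothetical illustration: a real `evaluate` function would train a network with the given number of hidden neurons and return its validation error, whereas here a synthetic U-shaped error curve stands in for it (too few neurons underfit, too many overfit):

```python
def select_hidden_neurons(candidates, evaluate):
    """Try each candidate hidden-layer size and keep the one with
    the lowest validation error (simple trial-and-error search)."""
    best_size, best_error = None, float("inf")
    for size in candidates:
        error = evaluate(size)
        if error < best_error:
            best_size, best_error = size, error
    return best_size, best_error

# Synthetic error curve for illustration only; in a real design loop,
# evaluate() would train a network and measure error on held-out data.
synthetic_error = lambda n: (n - 8) ** 2 / 100 + 0.05
best, err = select_hidden_neurons(range(2, 21), synthetic_error)
# best is the candidate with the lowest synthetic error (8 here)
```

The same loop structure applies to other design parameters, such as the learning rate, at the cost of one full training run per candidate.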

On the other hand, one may wish to improve the results. A neural network learns until the learning algorithm reaches its stop condition, either a maximum number of epochs or a target mean squared error. Even so, the results are sometimes inaccurate or fail to generalize. This calls for a redesign of the neural structure as well as of the dataset.
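As a minimal sketch of these two stop conditions, the loop below trains a single linear neuron with batch gradient descent and stops as soon as either the epoch limit or the MSE goal is reached. The model and hyperparameter values are illustrative assumptions, not the chapter's implementation:

```python
def train(inputs, targets, lr=0.1, max_epochs=1000, mse_goal=1e-4):
    """Batch gradient descent on a single linear neuron (output = w * x),
    stopping when either the epoch limit or the MSE goal is reached."""
    w = 0.0
    mse = float("inf")
    for epoch in range(1, max_epochs + 1):
        outputs = [w * x for x in inputs]
        errors = [t - o for t, o in zip(targets, outputs)]
        mse = sum(e * e for e in errors) / len(errors)
        if mse <= mse_goal:          # stop condition 1: error goal met
            return w, epoch, mse
        # gradient of MSE with respect to w
        grad = -2 * sum(e * x for e, x in zip(errors, inputs)) / len(inputs)
        w -= lr * grad
    return w, max_epochs, mse        # stop condition 2: epoch limit hit

# Targets follow y = 2x, so w should approach 2.0 well before max_epochs.
w, epochs_used, final_mse = train([0, 1, 2, 3], [0, 2, 4, 6])
```

If the epoch limit is reached with the error still far above the goal, that is the non-convergence symptom described above, and a sign that the structure, learning rate, or data needs to be revisited.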
