In Brazil, one way for a person to enter university is to take an entrance exam; a candidate who achieves the minimum grade required for the desired course may then enroll. To demonstrate the backpropagation algorithm, let us consider this scenario. The data shown in the following table was collected from a university database. The second column represents the person's gender (one means female, zero means male); the third column holds grades scaled down by 100; and the last column encodes the enrollment status as two output neurons (1,0 means the person performed the enrollment, and 0,1 means the person waived it).
| Sample | Gender | Grade | Enrollment status |
|---|---|---|---|
| 1 | 1 | 0.73 | 1,0 |
| 2 | 1 | 0.81 | 1,0 |
| 3 | 1 | 0.86 | 1,0 |
| 4 | 0 | 0.65 | 1,0 |
| 5 | 0 | 0.45 | 1,0 |
| 6 | 1 | 0.70 | 0,1 |
| 7 | 0 | 0.51 | 0,1 |
| 8 | 1 | 0.89 | 0,1 |
| 9 | 1 | 0.79 | 0,1 |
| 10 | 0 | 0.54 | 0,1 |
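As a small illustration of the encoding used in the table, the following sketch maps a raw grade and an enrollment decision to the network's input and target representation. The class and method names here are hypothetical, chosen for this example only; they are not part of the book's code:

```java
// Illustrative encoding of one table row; names are hypothetical.
public class SampleEncoding {

    // Returns {1,0} for a performed enrollment and {0,1} for a waived one,
    // matching the two output neurons described above.
    static double[] encodeEnrollment(boolean enrolled) {
        return enrolled ? new double[] { 1.0, 0.0 }
                        : new double[] { 0.0, 1.0 };
    }

    // Grades on a 0-100 scale are divided by 100, as in the Grade column.
    static double scaleGrade(double rawGrade) {
        return rawGrade / 100.0;
    }

    public static void main(String[] args) {
        // Sample 1 of the table: female (1), grade 73, enrolled.
        double[] target = encodeEnrollment(true);
        System.out.println(scaleGrade(73.0)
                + " -> [" + target[0] + ", " + target[1] + "]");
    }
}
```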
The following figure displays the architecture of the neural net to solve this problem:
Now, let's analyze the test method named testBackpropagation(). It is as follows:
```java
private void testBackpropagation() {
    NeuralNet testNet = new NeuralNet();
    testNet = testNet.initNet(2, 1, 3, 2);

    System.out.println("---BACKPROPAGATION INIT NET---");
    testNet.printNet(testNet);

    NeuralNet trainedNet = new NeuralNet();

    // first column is the BIAS
    testNet.setTrainSet(new double[][] {
        { 1.0, 1.0, 0.73 }, { 1.0, 1.0, 0.81 }, { 1.0, 1.0, 0.86 },
        { 1.0, 0.0, 0.65 }, { 1.0, 0.0, 0.45 }, { 1.0, 1.0, 0.70 },
        { 1.0, 0.0, 0.51 }, { 1.0, 1.0, 0.89 }, { 1.0, 1.0, 0.79 },
        { 1.0, 0.0, 0.54 } });
    testNet.setRealMatrixOutputSet(new double[][] {
        { 1.0, 0.0 }, { 1.0, 0.0 }, { 1.0, 0.0 }, { 1.0, 0.0 }, { 1.0, 0.0 },
        { 0.0, 1.0 }, { 0.0, 1.0 }, { 0.0, 1.0 }, { 0.0, 1.0 }, { 0.0, 1.0 } });

    testNet.setMaxEpochs(1000);
    testNet.setTargetError(0.002);
    testNet.setLearningRate(0.1);
    testNet.setTrainType(TrainingTypesENUM.BACKPROPAGATION);
    testNet.setActivationFnc(ActivationFncENUM.SIGLOG);
    testNet.setActivationFncOutputLayer(ActivationFncENUM.LINEAR);

    trainedNet = testNet.trainNet(testNet);

    System.out.println();
    System.out.println("---BACKPROPAGATION TRAINED NET---");
    testNet.printNet(trainedNet);
}
```
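The training loop itself lives inside the book's NeuralNet class. As a rough, self-contained illustration of what one backpropagation step does for the same 2-3-2 topology (sigmoid hidden layer, linear output layer), here is a sketch of a single weight update. All names here are ours for illustration, not the book's API, and the initial weights in any real run would be random:

```java
// Minimal sketch of one backpropagation update; not the book's implementation.
public class BackpropStep {

    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // input: {bias, gender, grade}; target: {1,0} or {0,1}.
    // wHidden[j][i]: weight from input i (index 0 = bias) to hidden neuron j.
    // wOut[k][j]: weight from hidden j (index 0 = hidden bias) to output k.
    // Returns the network output before the update, for inspection.
    static double[] update(double[] input, double[] target,
                           double[][] wHidden, double[][] wOut, double eta) {
        int nHidden = wHidden.length, nOut = wOut.length;

        // Forward pass.
        double[] hidden = new double[nHidden + 1];
        hidden[0] = 1.0;                              // hidden-layer bias unit
        for (int j = 0; j < nHidden; j++) {
            double net = 0.0;
            for (int i = 0; i < input.length; i++) net += wHidden[j][i] * input[i];
            hidden[j + 1] = sigmoid(net);
        }
        double[] out = new double[nOut];
        for (int k = 0; k < nOut; k++)
            for (int j = 0; j < hidden.length; j++) out[k] += wOut[k][j] * hidden[j];

        // Backward pass: linear output, so the output delta is just the error.
        double[] deltaOut = new double[nOut];
        for (int k = 0; k < nOut; k++) deltaOut[k] = target[k] - out[k];

        double[] deltaHidden = new double[nHidden];
        for (int j = 0; j < nHidden; j++) {
            double sum = 0.0;
            for (int k = 0; k < nOut; k++) sum += deltaOut[k] * wOut[k][j + 1];
            double h = hidden[j + 1];
            deltaHidden[j] = sum * h * (1.0 - h);     // sigmoid derivative
        }

        // Gradient-descent weight updates with learning rate eta.
        for (int k = 0; k < nOut; k++)
            for (int j = 0; j < hidden.length; j++)
                wOut[k][j] += eta * deltaOut[k] * hidden[j];
        for (int j = 0; j < nHidden; j++)
            for (int i = 0; i < input.length; i++)
                wHidden[j][i] += eta * deltaHidden[j] * input[i];

        return out;
    }
}
```

Repeating this update over all samples for many epochs, while monitoring the MSE against the target error, is essentially what the BACKPROPAGATION training type does.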
The backpropagation test logic is similar to that of Adaline and the perceptron. First, an object of the NeuralNet class is created and used to initialize the net with two neurons in the input layer, one hidden layer with three neurons, and two neurons in the output layer. The training data is taken from the preceding table. The maximum number of epochs is large because backpropagation typically needs many iterations to converge, which prolongs the learning process. Finally, the weights of the backpropagation-trained net and the MSE list are printed. A summary of the results is shown in the following figure:
Analyzing the graph of the MSE for each epoch, plotted in the following figure, we can conclude that the neural net learned to classify, on the basis of gender and grade, whether a person will or will not enroll at this university.
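For reference, the mean squared error plotted per epoch follows the usual definition: the squared differences between expected and predicted outputs, averaged over all samples and output neurons. The helper below is a hypothetical illustration of that formula, not a method of the book's NeuralNet class:

```java
// Hypothetical MSE helper; mirrors the standard definition of mean squared error.
public class MseUtil {

    // predicted and expected are [sample][outputNeuron] matrices of equal shape.
    static double mse(double[][] predicted, double[][] expected) {
        double sum = 0.0;
        int count = 0;
        for (int s = 0; s < expected.length; s++) {
            for (int k = 0; k < expected[s].length; k++) {
                double err = expected[s][k] - predicted[s][k];
                sum += err * err;
                count++;
            }
        }
        return sum / count;
    }
}
```

Training stops when this value drops below the target error (0.002 above) or when the maximum number of epochs is reached.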