Practical example 2 – predicting enrolment status

In Brazil, one way to enter university is to take an entrance exam; if the candidate achieves the minimum grade for the course they are seeking, they can enroll. To demonstrate the training algorithms on this scenario, we use the data shown in the following table, collected from a university database. The second column is the person's gender (one means female, zero means male); the third column contains the grades divided by 100; and the last column corresponds to two output neurons (1 0 means the person completed enrollment and 0 1 means the enrollment was waived). In the code that follows, these two outputs are encoded in bipolar form as +1 and -1, as illustrated in the sketch after the table:

Sample   Gender   Grade   Enrollment Status
1        1        0.73    1 0
2        1        0.81    1 0
3        1        0.86    1 0
4        0        0.65    1 0
5        0        0.45    1 0
6        1        0.70    0 1
7        0        0.51    0 1
8        1        0.89    0 1
9        1        0.79    0 1
10       0        0.54    0 1
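
To make this encoding explicit, the following standalone sketch (the Record class and encode() method are hypothetical helpers, not part of the book's code) turns a raw record into one row of the numeric array used below, with the gender flag, the grade divided by 100, and the +1/-1 pair for the enrollment status:

// Hypothetical helper illustrating the encoding used for _neuralDataSet below.
public class EnrollmentEncoder {

    // A raw record as it might appear in the university database.
    static class Record {
        boolean female;      // true -> 1.0, false -> 0.0
        double grade;        // original grade on a 0-100 scale
        boolean enrolled;    // true -> {+1, -1}, false -> {-1, +1}

        Record(boolean female, double grade, boolean enrolled) {
            this.female = female;
            this.grade = grade;
            this.enrolled = enrolled;
        }
    }

    // Encode one record into {gender, grade/100, out1, out2}.
    static Double[] encode(Record r) {
        return new Double[]{
            r.female ? 1.0 : 0.0,
            r.grade / 100.0,
            r.enrolled ? 1.0 : -1.0,
            r.enrolled ? -1.0 : 1.0
        };
    }

    public static void main(String[] args) {
        // Sample 1 from the table: female, grade 73, completed enrollment.
        Double[] row = encode(new Record(true, 73.0, true));
        System.out.println(java.util.Arrays.toString(row)); // [1.0, 0.73, 1.0, -1.0]
    }
}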

Double[][] _neuralDataSet = {
        //   gender  grade   enrolled  waived
            {1.0,   0.73,    1.0,    -1.0}
        ,   {1.0,   0.81,    1.0,    -1.0}
        ,   {1.0,   0.86,    1.0,    -1.0}
        ,   {0.0,   0.65,    1.0,    -1.0}
        ,   {0.0,   0.45,    1.0,    -1.0}
        ,   {1.0,   0.70,   -1.0,     1.0}
        ,   {0.0,   0.51,   -1.0,     1.0}
        ,   {1.0,   0.89,   -1.0,     1.0}
        ,   {1.0,   0.79,   -1.0,     1.0}
        ,   {0.0,   0.54,   -1.0,     1.0}
        };
        
int[] inputColumns = {0,1};    // gender and grade
int[] outputColumns = {2,3};   // the two enrollment status neurons
       
NeuralDataSet neuralDataSet = new NeuralDataSet(_neuralDataSet,inputColumns,outputColumns);
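
Conceptually, the two index arrays simply split each row of the matrix into an input part and a target part. The following sketch shows that slicing with plain arrays; it is only an illustration of the idea, not the book's NeuralDataSet implementation:

// Illustrative only: shows how the column index arrays split each data row
// into an input vector and a target vector.
public class ColumnSplitSketch {

    static double[][] selectColumns(Double[][] data, int[] columns) {
        double[][] result = new double[data.length][columns.length];
        for (int i = 0; i < data.length; i++)
            for (int j = 0; j < columns.length; j++)
                result[i][j] = data[i][columns[j]];
        return result;
    }

    public static void main(String[] args) {
        Double[][] data = {{1.0, 0.73, 1.0, -1.0}, {0.0, 0.45, 1.0, -1.0}};
        double[][] inputs  = selectColumns(data, new int[]{0, 1});  // gender, grade
        double[][] targets = selectColumns(data, new int[]{2, 3});  // the two outputs
        System.out.println(java.util.Arrays.deepToString(inputs));
        System.out.println(java.util.Arrays.deepToString(targets));
    }
}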

We create a neural network containing five neurons in the hidden layer, as shown in the following figure:

[Figure: network architecture with two inputs, a hidden layer of five sigmoid neurons, and two linear output neurons]
int numberOfInputs = 2;
int numberOfOutputs = 2;
int[] numberOfHiddenNeurons = {5};

Linear outputAcFnc = new Linear(1.0);
Sigmoid hdAcFnc = new Sigmoid(1.0);
IActivationFunction[] hiddenAcFnc = {hdAcFnc};

// One network for Levenberg-Marquardt training...
NeuralNet nnlm = new NeuralNet(numberOfInputs, numberOfOutputs,
        numberOfHiddenNeurons, hiddenAcFnc, outputAcFnc);

// ...and an identical one for the extreme learning machine.
NeuralNet nnelm = new NeuralNet(numberOfInputs, numberOfOutputs,
        numberOfHiddenNeurons, hiddenAcFnc, outputAcFnc);
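
To see what this 2-5-2 topology computes, the sketch below runs a single forward pass with plain arrays: five sigmoid hidden neurons followed by two linear output neurons, using randomly initialized weights. It is an illustration of the architecture, not the book's NeuralNet class:

import java.util.Random;

public class ForwardPassSketch {

    public static void main(String[] args) {
        Random rnd = new Random(42);
        int nIn = 2, nHidden = 5, nOut = 2;

        // Randomly initialized weights (plus one bias weight per neuron).
        double[][] wHidden = new double[nHidden][nIn + 1];
        double[][] wOut = new double[nOut][nHidden + 1];
        for (double[] row : wHidden)
            for (int j = 0; j < row.length; j++) row[j] = rnd.nextGaussian() * 0.5;
        for (double[] row : wOut)
            for (int j = 0; j < row.length; j++) row[j] = rnd.nextGaussian() * 0.5;

        // Sample 1 from the data set: female, grade 0.73.
        double[] x = {1.0, 0.73};

        // Hidden layer: sigmoid of the weighted sum plus bias.
        double[] h = new double[nHidden];
        for (int i = 0; i < nHidden; i++) {
            double sum = wHidden[i][nIn]; // bias term
            for (int j = 0; j < nIn; j++) sum += wHidden[i][j] * x[j];
            h[i] = 1.0 / (1.0 + Math.exp(-sum));
        }

        // Output layer: linear activation, so the output is the weighted sum itself.
        double[] y = new double[nOut];
        for (int i = 0; i < nOut; i++) {
            double sum = wOut[i][nHidden]; // bias term
            for (int j = 0; j < nHidden; j++) sum += wOut[i][j] * h[j];
            y[i] = sum;
        }

        System.out.printf("Untrained output: [%.3f, %.3f]%n", y[0], y[1]);
    }
}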

We also set up the two learning algorithms, Levenberg-Marquardt and the extreme learning machine:

LevenbergMarquardt lma = new LevenbergMarquardt(nnlm, neuralDataSet,
        LearningAlgorithm.LearningMode.BATCH);
lma.setDamping(0.001);
lma.setMaxEpochs(100);
lma.setMinOverallError(0.0001);
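
The damping factor set here is the λ in the Levenberg-Marquardt update Δw = (JᵀJ + λI)⁻¹ Jᵀe, which blends a Gauss-Newton step with a gradient-descent-like step. As a rough illustration, the following toy sketch applies that update to a one-parameter fit, where JᵀJ and Jᵀe reduce to simple sums (this is not the book's LevenbergMarquardt class):

// A toy, one-parameter illustration of the damped Levenberg-Marquardt step.
public class LMStepSketch {

    public static void main(String[] args) {
        double[] x = {1.0, 2.0, 3.0, 4.0};   // toy inputs
        double[] t = {2.0, 4.0, 6.0, 8.0};   // toy targets generated by t = 2 * x
        double w = 0.0;                      // the single weight being fitted
        double lambda = 0.001;               // damping factor, as in lma.setDamping(0.001)

        for (int epoch = 0; epoch < 100; epoch++) {          // cf. lma.setMaxEpochs(100)
            double jtj = 0.0, jte = 0.0, sse = 0.0;
            for (int i = 0; i < x.length; i++) {
                double error = t[i] - w * x[i];              // residual for this sample
                jtj += x[i] * x[i];                          // J^T J (a scalar here)
                jte += x[i] * error;                         // J^T e (a scalar here)
                sse += error * error;
            }
            if (sse < 0.0001) break;                         // cf. lma.setMinOverallError(0.0001)
            // Damped Gauss-Newton step: a small lambda behaves like Gauss-Newton,
            // a large lambda shrinks the step towards gradient descent.
            w += jte / (jtj + lambda);
        }
        System.out.println("Fitted w = " + w);               // converges to ~2.0
    }
}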
        
ELM elm = new ELM(nnelm,neuralDataSet);
elm.setMinOverallError(0.0001);
elm.printTraining=true;
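
Unlike Levenberg-Marquardt, the extreme learning machine does not iterate over epochs: it keeps the randomly initialized hidden weights fixed and computes the output weights in a single least-squares step. The sketch below illustrates that idea on the data above, using a single output (+1 for enrolled, -1 for waived), the normal equations, and a small Gaussian elimination; it is only a conceptual illustration, not the book's ELM class:

import java.util.Random;

public class ELMSketch {

    public static void main(String[] args) {
        // Inputs (gender, grade) and the first output neuron (+1 enrolled, -1 waived).
        double[][] x = {{1, 0.73}, {1, 0.81}, {1, 0.86}, {0, 0.65}, {0, 0.45},
                        {1, 0.70}, {0, 0.51}, {1, 0.89}, {1, 0.79}, {0, 0.54}};
        double[] t = {1, 1, 1, 1, 1, -1, -1, -1, -1, -1};

        int nHidden = 5;
        Random rnd = new Random(1234);

        // 1) Random, fixed hidden weights (2 inputs + bias per hidden neuron).
        double[][] wh = new double[nHidden][3];
        for (double[] row : wh)
            for (int j = 0; j < 3; j++) row[j] = rnd.nextGaussian();

        // 2) Hidden-layer output matrix H (one row per sample, plus a bias column).
        int n = x.length, m = nHidden + 1;
        double[][] H = new double[n][m];
        for (int i = 0; i < n; i++) {
            for (int k = 0; k < nHidden; k++) {
                double s = wh[k][2] + wh[k][0] * x[i][0] + wh[k][1] * x[i][1];
                H[i][k] = 1.0 / (1.0 + Math.exp(-s));    // sigmoid hidden activation
            }
            H[i][nHidden] = 1.0;                         // output bias
        }

        // 3) Solve (H^T H + eps*I) beta = H^T t for the output weights.
        double[][] A = new double[m][m];
        double[] b = new double[m];
        for (int i = 0; i < m; i++) {
            for (int j = 0; j < m; j++)
                for (int k = 0; k < n; k++) A[i][j] += H[k][i] * H[k][j];
            A[i][i] += 1e-8;                             // tiny ridge for numerical stability
            for (int k = 0; k < n; k++) b[i] += H[k][i] * t[k];
        }
        double[] beta = solve(A, b);

        // 4) Check the training error.
        double sse = 0.0;
        for (int i = 0; i < n; i++) {
            double y = 0.0;
            for (int j = 0; j < m; j++) y += H[i][j] * beta[j];
            sse += (y - t[i]) * (y - t[i]);
        }
        System.out.println("Sum of squared errors: " + sse);
    }

    // Plain Gaussian elimination with partial pivoting.
    static double[] solve(double[][] A, double[] b) {
        int m = b.length;
        for (int col = 0; col < m; col++) {
            int pivot = col;
            for (int r = col + 1; r < m; r++)
                if (Math.abs(A[r][col]) > Math.abs(A[pivot][col])) pivot = r;
            double[] tmpRow = A[col]; A[col] = A[pivot]; A[pivot] = tmpRow;
            double tmpB = b[col]; b[col] = b[pivot]; b[pivot] = tmpB;
            for (int r = col + 1; r < m; r++) {
                double factor = A[r][col] / A[col][col];
                for (int c = col; c < m; c++) A[r][c] -= factor * A[col][c];
                b[r] -= factor * b[col];
            }
        }
        double[] solution = new double[m];
        for (int r = m - 1; r >= 0; r--) {
            double s = b[r];
            for (int c = r + 1; c < m; c++) s -= A[r][c] * solution[c];
            solution[r] = s / A[r][r];
        }
        return solution;
    }
}

Because the only unknowns are the output weights, training reduces to this single linear solve, which is why the ELM needs no epoch loop.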

Running the training, we find that both algorithms were successful. For the Levenberg-Marquardt algorithm, the overall error dropped below the minimum threshold after nine epochs:

[Figure: Levenberg-Marquardt training output, with the overall error reaching the minimum after nine epochs]

And the extreme learning machine reached an error close to zero:

[Figure: extreme learning machine training output, with an overall error close to zero]