Neural network

Finally, we arrive at the neural network model. Note that our training set has only approximately 18,000 observations; the most successful neural network models (for example, "deep learning" models) are typically trained on millions or even billions of observations. Nevertheless, let's see how our neural network model fares. For neural networks, it is recommended that the data be scaled appropriately (for example, to a standard normal distribution, with a mean of 0 and a standard deviation of 1). We use the StandardScaler class to accomplish this:

from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

# Scale data: fit the scaler on the training set only, then apply
# the same transformation to both sets to avoid data leakage
scaler = StandardScaler()
scaler.fit(X_train)
X_train_Tx = scaler.transform(X_train)
X_test_Tx = scaler.transform(X_test)

# Fit models that require scaling (e.g. neural networks)
hl_sizes = [150,100,80,60,40,20]
nn_clfs = [MLPClassifier(hidden_layer_sizes=(size,), random_state=2345, verbose=True) for size in hl_sizes]

for num, nn_clf in enumerate(nn_clfs):
    print(str(hl_sizes[num]) + '-unit network:')
    nn_clf.fit(X_train_Tx, y_train.ravel())
    print('Training accuracy: ' + str(nn_clf.score(X_train_Tx, y_train)))
    print('Validation accuracy: ' + str(nn_clf.score(X_test_Tx, y_test)))
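As a quick sanity check on the scaling step, the following sketch (using synthetic data in place of our training set, which is not reproduced here) confirms that StandardScaler yields features with a mean of approximately 0 and a standard deviation of approximately 1:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Synthetic features with an arbitrary mean and spread
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=3.0, size=(1000, 4))

scaler = StandardScaler()
X_tx = scaler.fit_transform(X)

print(X_tx.mean(axis=0).round(6))  # approximately 0 for every feature
print(X_tx.std(axis=0).round(6))   # approximately 1 for every feature
```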

Once you run the preceding cell, you will see iterations being completed, and once the iterations fail to produce further improvements to the model, training will stop and the accuracies will be printed. In our run, the validation accuracy of the model with 150 units in its hidden layer was 87%, higher than that of the other hidden layer sizes.
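This stopping behavior is controlled by a few MLPClassifier parameters. The following is a minimal sketch using synthetic data and illustrative values (scikit-learn's defaults), not the exact configuration of the run above:

```python
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for the scaled training data
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

clf = MLPClassifier(
    hidden_layer_sizes=(150,),
    tol=1e-4,             # minimum loss improvement that counts as progress
    n_iter_no_change=10,  # stop after this many epochs without improvement
    max_iter=200,         # hard cap on the number of epochs
    random_state=2345,
)
clf.fit(X, y)
print(clf.n_iter_)  # epochs actually run before training stopped
```

Raising n_iter_no_change or lowering tol makes the solver more patient before it declares convergence.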
