A support vector machine (SVM) is a supervised machine learning method. As we saw previously, it can be used for regression as well as for classification. The principle of this algorithm is to find a hyperplane that separates the data into two classes.
Let's have a look at the following code, which implements this approach:
from sklearn.svm import SVC
# Fit the model
svc = SVC()
svc.fit(X_train, Y_train)
# Forecast values
goog_data['Predicted_Signal'] = svc.predict(X)
goog_data['GOOG_Returns'] = np.log(goog_data['Close'] /
                                   goog_data['Close'].shift(1))
cum_goog_return = calculate_return(goog_data, split_value=len(X_train), symbol='GOOG')
cum_strategy_return = calculate_strategy_return(goog_data, split_value=len(X_train))
plot_chart(cum_goog_return, cum_strategy_return, symbol='GOOG')
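The listing above depends on data and helper functions (calculate_return, calculate_strategy_return, plot_chart) defined earlier in the chapter. As a self-contained sketch of just the fit/predict step, the following uses synthetic two-class data in place of the price features (all data and names here are illustrative, not part of the original example):

```python
import numpy as np
from sklearn.svm import SVC

# Illustrative synthetic data standing in for the price features
rng = np.random.RandomState(0)
X = rng.randn(200, 2)
# Label each point by which side of a straight line it falls on
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Simple chronological-style split, mirroring X_train / X in the listing
X_train, y_train = X[:150], y[:150]
X_test, y_test = X[150:], y[150:]

# Fit the model with default constructor values, as in the example
svc = SVC()
svc.fit(X_train, y_train)

# Forecast values for unseen data
predictions = svc.predict(X_test)
print(predictions[:5], svc.score(X_test, y_test))
```

Since this toy data is linearly separable, even the default SVC should classify the held-out points with high accuracy.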
In this example, the following applies:
- Instead of instantiating the class we used to create a KNN model, we use the SVC class.
- The class constructor has several parameters that adjust the behavior of the method to the data you will work on.
- The most important one is the kernel parameter, which defines how the hyperplane is constructed.
- In this example, we just use the default values of the constructor.
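To see why the kernel parameter matters, here is a hedged sketch comparing a linear kernel against the default RBF kernel on toy data whose classes are separated by a circle rather than a straight line (the dataset and comparison are illustrative assumptions, not part of the original example):

```python
import numpy as np
from sklearn.svm import SVC

# Toy data: class depends on distance from the origin, so no straight
# hyperplane in the input space can separate the two classes.
rng = np.random.RandomState(42)
X = rng.randn(300, 2)
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 1.0).astype(int)

X_train, y_train = X[:200], y[:200]
X_test, y_test = X[200:], y[200:]

# Compare how each kernel builds its decision boundary
for kernel in ('linear', 'rbf'):
    svc = SVC(kernel=kernel)
    svc.fit(X_train, y_train)
    print(kernel, round(svc.score(X_test, y_test), 2))
```

On this data the RBF kernel (the SVC default) should clearly outperform the linear kernel, because it can bend the decision boundary around the inner class; this is the kind of behavior the kernel parameter controls.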
Now, let's have a look at the output of the code: