Logistic regression

We can implement logistic regression as an R6 class in much the same way. The code is included mostly for intellectual curiosity, since it is so closely related to the perceptron.

There is not a lot of difference between the perceptron and logistic regression. The main change is the activation function: the sigmoid (logistic) function, implemented below as logit(), replaces the Heaviside step function, which in turn changes the update rule for the weights. The most relevant differences are in the forward and backward methods.
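For reference, the activation and the resulting weight update (a standard squared-error gradient step, with $\eta$ denoting the learning rate; this is what the backward method below implements) can be written as:

$$\sigma(z) = \frac{1}{1 + e^{-z}}, \qquad w_j \leftarrow w_j + \eta\,(t - y)\,y\,(1 - y)\,x_j, \quad \text{where } y = \sigma(\mathbf{w} \cdot \mathbf{x}).$$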

library(R6)

# Sigmoid (inverse logit) activation
logit <- function(x){
  1 / (1 + exp(-x))
}

LR <- R6Class("LR",
  public = list(
    dim = NULL,
    n_iter = NULL,
    learning_rate = NULL,
    w = NULL,
    initialize = function(learning_rate = 0.25, n_iter = 100, dim = 2){
      self$n_iter <- n_iter
      self$learning_rate <- learning_rate
      self$dim <- dim
      # One weight per input dimension, plus one for the bias
      self$w <- matrix(runif(self$dim + 1), ncol = self$dim + 1)
    },
    forward = function(x){
      # Sigmoid of the dot product, instead of the Heaviside step
      dot_product <- sum(x * self$w)
      y <- logit(dot_product)
      return(y)
    },
    backward = function(t, y, x){
      # Gradient step; y*(1 - y) is the sigmoid derivative evaluated
      # at the dot product (y is the output of forward)
      for(j in 1:length(x)){
        self$w[j] <- self$w[j] + self$learning_rate * (t - y) * y * (1 - y) * x[j]
      }
    },
    train = function(X, t){
      X <- cbind(-1, X)  # add bias term
      n_examples <- nrow(X)
      for(iter in 1:self$n_iter){
        for(i in 1:n_examples){
          y_i <- self$forward(X[i, ])
          self$backward(t[i], y_i, X[i, ])
        }
        if(iter %% 20 == 0){
          cat("Iteration: ", iter, "\n")
          print("Weights: ")
          print(unlist(self$w))
        }
      }
    },
    predict = function(X){
      X <- cbind(-1, X)  # add bias
      preds <- c()
      for(i in 1:nrow(X)){
        preds[i] <- self$forward(X[i, ])
      }
      return(preds)
    }
  )
)

As we can see, little changes with respect to the previous code; the main action happens in the backward step, where the weight update now involves the derivative of the sigmoid.
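To see the class in action, here is a minimal usage sketch; the seed, the synthetic data, and the 0.5 decision threshold are illustrative assumptions rather than part of the original code:

set.seed(42)
# Synthetic, linearly separable data: label is 1 when x1 + x2 > 0
X <- matrix(runif(200, -1, 1), ncol = 2)
t <- as.integer(X[, 1] + X[, 2] > 0)
model <- LR$new(learning_rate = 0.25, n_iter = 100, dim = 2)
model$train(X, t)
preds <- model$predict(X)                # predicted probabilities
mean(as.integer(preds > 0.5) == t)       # training accuracy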
