Different types of regression

This section will cover different types of regression:

  • Linear regression: This is the oldest and most widely known type of regression. The dependent variable is continuous, the independent variables can be discrete or continuous, and the regression line is linear. Linear regression is very sensitive to outliers and to cross-correlations among predictors.
  • Logistic regression: This is used when the dependent variable is binary (0 or 1, success or failure, survived or died, yes or no, true or false). It is widely used in clinical trials, fraud detection, and so on. It does not require a linear relationship between the dependent and independent variables.
  • Polynomial regression: This fits a polynomial equation in which the power of the independent variable is greater than one. The regression line is therefore not a straight line but a curve.
  • Ridge regression: This is a more robust version of linear regression, used when the data variables are highly correlated. By placing a constraint (penalty) on the size of the regression coefficients, it shrinks them toward more natural, stable estimates.
  • Lasso regression: This is similar to ridge regression, but it penalizes the absolute size of the regression coefficients. It also performs variable selection automatically, since some coefficients are shrunk exactly to zero.
  • Ecologic regression: This is used when the data is divided into groups or strata, and a separate regression is performed per group or stratum. One should be cautious with this kind of regression, as the best regressions may be overshadowed by noisy ones.
  • Logic regression: This is similar to logistic regression, but it is used in scoring algorithms where all the variables are binary.
  • Bayesian regression: This uses Bayesian inference and conditional probability. It takes an approach similar to ridge regression, penalizing the estimator to make it more flexible and stable. It assumes some prior knowledge about the regression coefficients and the error terms, with the error information expressed as an approximate probability distribution.
  • Quantile regression: This is used when the area of interest is extreme limits, for example studying deaths at high pollution levels. Instead of the conditional mean, it estimates conditional quantiles of the dependent variable.
  • Jackknife regression: This uses a leave-one-out resampling procedure to reduce bias and to estimate the variance of the regression estimates.
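The linear regression bullet above can be sketched with ordinary least squares in NumPy. The data and coefficients here are illustrative assumptions, not from the text:

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus a little noise (illustrative assumption)
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, size=x.shape)

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# Ordinary least squares: minimize ||X b - y||^2
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = coef
print(intercept, slope)
```

The fitted intercept and slope land close to the true values 1 and 2; with outliers injected into `y`, the same fit degrades quickly, which is the sensitivity the bullet mentions.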
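For the logistic regression bullet, a minimal sketch using plain gradient descent on the cross-entropy loss (the toy data and learning rate are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy binary outcome: label is 1 when x > 5 (illustrative assumption)
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)
y = (x > 5).astype(float)

X = np.column_stack([np.ones_like(x), x])
w = np.zeros(2)

# Gradient descent on the logistic (cross-entropy) loss
for _ in range(2000):
    p = sigmoid(X @ w)          # predicted probabilities
    grad = X.T @ (p - y) / len(y)
    w -= 0.1 * grad

# The predicted probability crosses 0.5 near the true boundary x = 5
boundary = -w[0] / w[1]
print(boundary)
</pre>
```

Note that the model predicts a probability between 0 and 1, not a straight line through the outcomes, which is why no linear relationship between the variables is required.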
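The ridge regression bullet can be illustrated with its closed-form solution on two highly correlated predictors (synthetic data; the penalty value is an illustrative assumption):

```python
import numpy as np

# Two almost identical predictors (illustrative synthetic data)
rng = np.random.default_rng(2)
x1 = rng.normal(0, 1, 100)
x2 = x1 + rng.normal(0, 0.01, 100)   # nearly a copy of x1
y = x1 + x2 + rng.normal(0, 0.1, 100)

X = np.column_stack([x1, x2])

def ridge(X, y, lam):
    # Closed-form ridge solution: (X'X + lam I)^-1 X'y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

ols = ridge(X, y, 0.0)   # lam = 0 reduces to ordinary least squares
reg = ridge(X, y, 1.0)   # penalized: shrunk, stable coefficients
print(ols, reg)
```

With correlated predictors the unpenalized coefficients can swing wildly (only their sum is well determined), while the ridge penalty pulls both toward the stable value of about 1 each. Lasso replaces the squared penalty `lam * b^2` with the absolute penalty `lam * |b|`, which is what drives some coefficients exactly to zero.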
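Finally, the jackknife bullet can be sketched by refitting a simple regression with each observation left out in turn; the data here is an illustrative assumption:

```python
import numpy as np

# Synthetic data with true slope 3 (illustrative assumption)
rng = np.random.default_rng(3)
x = np.linspace(0, 10, 30)
y = 3.0 * x + rng.normal(0, 1.0, size=x.shape)

def slope(x, y):
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

n = len(x)
full = slope(x, y)
# Leave-one-out (jackknife) replications of the slope estimate
loo = np.array([slope(np.delete(x, i), np.delete(y, i)) for i in range(n)])

# Standard jackknife bias and variance estimates
bias = (n - 1) * (loo.mean() - full)
var = (n - 1) / n * np.sum((loo - loo.mean()) ** 2)
print(full, bias, var)
```

The spread of the leave-one-out replications yields the variance estimate, and the gap between their mean and the full-sample estimate yields the bias correction.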