Regressions
Updated at 2017-11-13 14:39
Regression analysis is a statistical process for estimating the relationships among variables.
In regression machine learning problems, the desired output is a continuous number, e.g. the age of a person.
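As a minimal sketch of such a continuous-output prediction, the snippet below fits scikit-learn's LinearRegression on a tiny made-up dataset (the feature values and ages are invented for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Made-up data: years of work experience (feature) vs. age (continuous target).
X = np.array([[1.0], [3.0], [5.0], [10.0], [20.0]])  # shape (n_samples, n_features)
y = np.array([22.0, 26.0, 29.0, 35.0, 45.0])

model = LinearRegression()
model.fit(X, y)

# Predict the age for an unseen input; the output is a continuous number.
print(model.predict(np.array([[7.0]])))
```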
- Simple Linear Regression: a linear predictor function with one input variable, e.g. (y = 2x + 1).
- Multiple Linear Regression: a linear predictor function with more than one input variable.
- Ordinal Regression: the model output is an ordinal variable, i.e. the values have an order but unknown distances between them, e.g. "good", "ok", "poor".
- Nonlinear Regression: the predictor function is nonlinear, such as an exponential. There may be multiple local minima to optimize (see the curve-fitting sketch after this list).
- General Linear Model: a matrix formula (y = Xb + u) with input matrix X, output matrix y, model coefficient matrix b and error matrix u. b works like neural network weights and u like bias; finding the right values for those matrices allows creating predictions (see the least-squares sketch after this list).
- Generalized Linear Model (GLM): allows for outputs that have error distribution models other than a normal distribution by allowing the linear model to be related to the response variable via a link function (see the Poisson sketch after this list).
Logistic Regressions
- Logistic Regression predictions are categorical.
- Binary Logistic Regression predictions are either 0 or 1 (see the sketch after this list).
- Multinomial Logistic Regression predictions can take more than two possible discrete outcomes.
- Ordered Logistic Regression predictions are ordinal e.g. "poor", "fair" and "good".
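A minimal binary logistic regression sketch with scikit-learn; the hours-studied data and pass/fail labels are invented, and the prediction is a category (0 or 1) rather than a continuous number:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Made-up data: hours studied (feature) vs. pass/fail label (0 or 1).
X = np.array([[0.5], [1.0], [1.5], [2.0], [3.0], [4.0], [5.0], [6.0]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

clf = LogisticRegression()
clf.fit(X, y)

print(clf.predict([[2.5]]))        # predicted class: 0 or 1
print(clf.predict_proba([[2.5]]))  # class probabilities
```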