Simple linear regression is the least squares estimator of a linear regression model with a single explanatory variable. It fits a straight line through a set of n points so that the sum of squared residuals of the model is as small as possible. "Simple" refers to the fact that, with only one explanatory variable, this is one of the simplest regression models in statistics. The slope of the fitted line equals the correlation between y and x multiplied by the ratio of the standard deviation of y to that of x. The intercept is chosen so that the line passes through the center of mass of the data points, (x̄, ȳ).
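A minimal sketch of these two formulas, using hypothetical sample data: the slope is computed as the correlation times the ratio of standard deviations, the intercept from the sample means, and the result is cross-checked against NumPy's least-squares fit.

```python
import numpy as np

# Hypothetical sample data, for illustration only
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Slope = correlation(x, y) * (std of y / std of x)
r = np.corrcoef(x, y)[0, 1]
slope = r * (np.std(y) / np.std(x))

# Intercept chosen so the line passes through the center of mass (x̄, ȳ)
intercept = y.mean() - slope * x.mean()

# Cross-check against NumPy's built-in least-squares polynomial fit
slope_ls, intercept_ls = np.polyfit(x, y, 1)
print(slope, intercept)
print(slope_ls, intercept_ls)
```

The two computations agree because the least-squares solution for a single regressor reduces exactly to these closed-form expressions.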
Simple linear regression has three numerical properties. These are as follows:
- The fitted line passes through the "center of mass" point (x̄, ȳ).
- The sum of the residuals is zero, provided the model includes a constant term.
- The linear combination of the residuals in which the coefficients are the x-values is zero; that is, the residuals are orthogonal to x.
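The three properties above can be checked numerically. This sketch fits a line to hypothetical data with NumPy's least-squares fit and verifies each property in turn.

```python
import numpy as np

# Hypothetical sample data, for illustration only
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

# Least-squares fit: y ≈ slope * x + intercept
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

# Property 1: the line passes through the center of mass (x̄, ȳ)
print(abs((slope * x.mean() + intercept) - y.mean()))
# Property 2: the residuals sum to zero (model includes a constant)
print(abs(residuals.sum()))
# Property 3: the residuals are orthogonal to the x-values
print(abs((x * residuals).sum()))
```

All three printed values are zero up to floating-point rounding; these identities follow directly from the normal equations of least squares.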
Describing the statistical properties of the estimates from a simple linear regression requires a statistical model. The estimates are optimal under the assumptions of the model in which they were derived, but their properties can also be evaluated under other assumptions.