Ordinary least squares (OLS) is a method for estimating the unknown parameters in a linear regression model. It minimizes the sum of squared vertical distances between the observed responses in the dataset and the responses predicted by the linear approximation. The resulting estimator can be expressed by a simple formula, especially in the case of a single regressor on the right-hand side. The ordinary least squares estimator is consistent when the regressors are exogenous and there is no perfect multicollinearity, and it is optimal in the class of linear unbiased estimators when the errors are homoscedastic and serially uncorrelated (the Gauss–Markov theorem). Under the additional assumption that the errors are normally distributed, ordinary least squares coincides with the maximum likelihood estimator.
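The "simple formula" mentioned above can be sketched as follows. This is a minimal illustration, not a production implementation: it solves the normal equations beta = (X'X)^(-1) X'y directly for a single regressor plus an intercept, using only NumPy.

```python
import numpy as np

def ols_fit(x, y):
    """Return (intercept, slope) via the closed-form OLS formula.

    Minimal sketch for a single regressor: build a design matrix with a
    constant column, then solve the normal equations (X'X) beta = X'y.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Design matrix: a column of ones (intercept) and the regressor.
    X = np.column_stack([np.ones_like(x), x])
    # Solve the normal equations rather than forming an explicit inverse.
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    return beta[0], beta[1]

# On exactly linear data, OLS recovers the generating coefficients.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 + 3.0 * x
intercept, slope = ols_fit(x, y)
```

In practice one would use a well-tested routine such as `numpy.linalg.lstsq` or a statistics library instead of solving the normal equations by hand, since the explicit form is numerically less stable when the regressors are nearly collinear.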

There are many frameworks in which the linear regression model can be cast so that the ordinary least squares technique is applicable. Each of these settings produces the same formulas and the same results; they differ only in interpretation and in the assumptions that must be imposed for the method to give meaningful results. One key difference in interpretation is whether the regressors are treated as random variables or as predefined constants.

Ordinary least squares analysis often includes diagnostic plots designed to detect departures of the data from the assumed form of the model. Common diagnostic plots include residuals against the explanatory variables in the model, residuals against explanatory variables not in the model, residuals against the fitted values, and residuals against the preceding residuals.
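The quantities behind these diagnostic plots can be computed as below. This is a hedged sketch assuming a fitted single-regressor model; the actual plotting calls (e.g. with matplotlib) are omitted, and each commented pair names what would go on the x- and y-axes of one diagnostic plot.

```python
import numpy as np

# Illustrative data (assumed, not from the text): a roughly linear relation.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Fit the model and form the residuals y - y_hat.
X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.solve(X.T @ X, X.T @ y)
fitted = X @ beta
residuals = y - fitted

# Axis pairs for the diagnostic plots named in the text:
#   residuals against the explanatory variable: (x, residuals)
#   residuals against the fitted values:        (fitted, residuals)
#   residuals against the preceding residuals:  lagged pair below
lag_x, lag_y = residuals[:-1], residuals[1:]
```

A useful check on such a fit: when the model includes an intercept, the residuals sum to zero by construction, so any visible trend in these plots signals a departure from the assumed model rather than an artifact of the fitting.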
