Ordinary Least Squares
In statistics, ordinary least squares (OLS) or linear least squares is a method for estimating the unknown parameters in a linear regression model. This method minimizes the sum of squared vertical distances between the observed responses in the dataset and the responses predicted by the linear approximation. The resulting estimator can be expressed by a simple formula, especially in the case of a single regressor on the right-hand side.
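In matrix notation, writing X for the design matrix (one row per observation, one column per regressor) and y for the vector of observed responses, the closed form referred to above is

\hat{\beta} = (X^\top X)^{-1} X^\top y,

and with a single regressor x plus an intercept it reduces to

\hat{\beta} = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}, \qquad \hat{\alpha} = \bar{y} - \hat{\beta}\,\bar{x}.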
The OLS estimator is consistent when the regressors are exogenous and there is no perfect multicollinearity, and optimal in the class of linear unbiased estimators when the errors are homoscedastic and serially uncorrelated. Under these conditions, the method of OLS provides minimum-variance mean-unbiased estimation when the errors have finite variances. Under the additional assumption that the errors be normally distributed, OLS is the maximum likelihood estimator. OLS is used in economics (econometrics) and electrical engineering (control theory and signal processing), among many areas of application.
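A minimal numerical sketch of the matrix formula above, assuming Python with NumPy and synthetic data chosen only for illustration:

import numpy as np

# Synthetic data: true intercept 2.0, true slope 3.0, Gaussian noise.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 + 3.0 * x + rng.normal(size=100)

# Design matrix with an intercept column, then solve the normal equations
# (X'X) beta = X'y rather than inverting X'X explicitly.
X = np.column_stack([np.ones_like(x), x])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # roughly [2.0, 3.0]

In practice np.linalg.lstsq(X, y, rcond=None) returns the same estimate with better numerical behavior on ill-conditioned design matrices.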