A special case of generalized least squares called weighted least squares occurs when all the off-diagonal entries of Ω (the correlation matrix of the residuals) are zero; the variances of the observations (along the diagonal of the covariance matrix) may still be unequal (heteroscedasticity).
The expressions given above are based on the implicit assumption that the errors are uncorrelated with each other and with the independent variables, and have equal variance. The Gauss–Markov theorem shows that, when this is so, \hat{\boldsymbol\beta} is a best linear unbiased estimator (BLUE). If, however, the measurements are uncorrelated but have different uncertainties, a modified approach might be adopted. Aitken showed that when the weighted sum of squared residuals

S = \sum_{i=1}^{n} W_{ii}\, r_i^2

is minimized, \hat{\boldsymbol\beta} is the BLUE if each weight is equal to the reciprocal of the variance of the measurement, W_{ii} = 1/\sigma_i^2.
The gradient equations for this sum of squares are

-2 \sum_{i} W_{ii} \frac{\partial f(x_i, \boldsymbol\beta)}{\partial \beta_j}\, r_i = 0, \qquad j = 1, \ldots, m,

which, in a linear least squares system, give the modified normal equations

\sum_{i=1}^{n} \sum_{k=1}^{m} X_{ij} W_{ii} X_{ik} \hat\beta_k = \sum_{i=1}^{n} X_{ij} W_{ii} y_i, \qquad j = 1, \ldots, m.

When the observational errors are uncorrelated and the weight matrix, W, is diagonal, these may be written in matrix form as

\left(X^\mathsf{T} W X\right) \hat{\boldsymbol\beta} = X^\mathsf{T} W \mathbf{y}.
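As an illustration, here is a minimal numerical sketch of solving these weighted normal equations with NumPy; the straight-line model, the data, and the measurement standard deviations `sigma` are hypothetical.

```python
import numpy as np

# Hypothetical data: fit a straight line y = b0 + b1*x with unequal measurement errors.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])
sigma = np.array([0.1, 0.2, 0.1, 0.3, 0.2])   # standard deviation of each observation

X = np.column_stack([np.ones_like(x), x])     # design matrix
W = np.diag(1.0 / sigma**2)                   # weights = reciprocal variances

# Modified normal equations: (X^T W X) beta = X^T W y
beta_hat = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(beta_hat)
```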
If the errors are correlated, the resulting estimator is BLUE if the weight matrix is equal to the inverse of the variance-covariance matrix of the observations.
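For instance, a minimal sketch of this correlated case follows; the AR(1)-style covariance matrix `Sigma`, the data, and the parameters `rho` and `s2` are hypothetical.

```python
import numpy as np

# Hypothetical correlated observation errors: AR(1)-style covariance matrix.
n = 5
x = np.arange(n, dtype=float)
X = np.column_stack([np.ones(n), x])
y = np.array([1.0, 1.4, 2.1, 2.9, 3.6])

rho, s2 = 0.6, 0.04
Sigma = s2 * rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

W = np.linalg.inv(Sigma)   # weight matrix = inverse of the variance-covariance matrix
beta_gls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(beta_gls)
```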
When the errors are uncorrelated, it is convenient to simplify the calculations by factoring the weight matrix as w_{ii} = \sqrt{W_{ii}}. The normal equations can then be written in the same form as for ordinary least squares:

\left(X'^\mathsf{T} X'\right) \hat{\boldsymbol\beta} = X'^\mathsf{T} \mathbf{y}',

where

X' = wX, \qquad \mathbf{y}' = w\mathbf{y}.
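The following sketch shows this equivalence on the same hypothetical data as before: scaling each row of X and y by the square root of its weight and then running an unweighted least-squares solve reproduces the weighted estimate.

```python
import numpy as np

# Same hypothetical data as in the previous sketch.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])
sigma = np.array([0.1, 0.2, 0.1, 0.3, 0.2])
X = np.column_stack([np.ones_like(x), x])

w = np.diag(1.0 / sigma)          # w = sqrt(W), since W_ii = 1 / sigma_i^2
X_prime = w @ X                   # X' = w X
y_prime = w @ y                   # y' = w y

# Ordinary (unweighted) least squares on the transformed data:
# (X'^T X') beta = X'^T y'
beta_hat, *_ = np.linalg.lstsq(X_prime, y_prime, rcond=None)
print(beta_hat)                   # matches the weighted solve above
```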
For non-linear least squares systems, a similar argument shows that the normal equations should be modified as follows:

\left(J^\mathsf{T} W J\right) \Delta\boldsymbol\beta = J^\mathsf{T} W\, \Delta\mathbf{y},

where J is the Jacobian of the model with respect to the parameters and \Delta\mathbf{y} is the vector of residuals at the current parameter estimate.
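A hedged sketch of weighted Gauss–Newton iterations using these modified normal equations is shown below; the exponential model `f`, its Jacobian, the data, and the starting guess are all hypothetical.

```python
import numpy as np

def f(x, beta):
    """Hypothetical non-linear model: f(x; beta) = beta0 * exp(beta1 * x)."""
    return beta[0] * np.exp(beta[1] * x)

def jacobian(x, beta):
    """Partial derivatives of f with respect to beta0 and beta1."""
    return np.column_stack([np.exp(beta[1] * x),
                            beta[0] * x * np.exp(beta[1] * x)])

# Hypothetical data and weights (reciprocal variances).
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = np.array([1.0, 1.6, 2.8, 4.4, 7.5])
W = np.diag(1.0 / np.array([0.1, 0.1, 0.2, 0.2, 0.3])**2)

beta = np.array([1.0, 1.0])                  # initial guess
for _ in range(10):                          # a few Gauss-Newton iterations
    J = jacobian(x, beta)
    dy = y - f(x, beta)                      # residual vector (Delta y)
    # Modified normal equations: (J^T W J) delta_beta = J^T W dy
    delta = np.linalg.solve(J.T @ W @ J, J.T @ W @ dy)
    beta = beta + delta
print(beta)
```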
Note that for empirical tests the appropriate W is not known for sure and must be estimated. For this, feasible generalized least squares (FGLS) techniques may be used.
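One simple two-step sketch in this spirit follows; the particular scheme (estimating the variances by smoothing the squared residuals of an initial unweighted fit, here with a quadratic in x) is an illustrative assumption, not the only FGLS procedure, and the data are hypothetical.

```python
import numpy as np

# Hypothetical heteroscedastic data: noise standard deviation grows with x.
rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 50)
X = np.column_stack([np.ones_like(x), x])
y = 2.0 + 0.5 * x + rng.normal(scale=0.1 * x)

# Step 1: ordinary least squares to obtain preliminary residuals.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_ols

# Step 2 (assumed scheme): estimate the variances from the squared residuals,
# then re-fit with the estimated weights.
var_hat = np.polyval(np.polyfit(x, resid**2, 2), x).clip(min=1e-8)
W = np.diag(1.0 / var_hat)
beta_fgls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(beta_ols, beta_fgls)
```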