Consistent Estimator

In statistics, a sequence of estimators for a parameter θ0 is said to be consistent (or asymptotically consistent) if the sequence converges in probability to θ0. This means that the distributions of the estimators become more and more concentrated near the true value of the parameter being estimated, so that the probability that the estimator lies arbitrarily close to θ0 converges to one.
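The definition above can be written formally. As a sketch in standard notation (the symbol T_n, for the estimator computed from a sample of size n, is chosen here for illustration and does not appear in the text):

```latex
T_n \xrightarrow{\;p\;} \theta_0
\quad\Longleftrightarrow\quad
\lim_{n\to\infty}\Pr\!\left(\,|T_n-\theta_0|>\varepsilon\,\right)=0
\quad\text{for every }\varepsilon>0.
```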

In practice one usually constructs a single estimator as a function of an available sample of size n, and then imagines being able to keep collecting data and expanding the sample ad infinitum. In this way one obtains a sequence of estimators indexed by n, and consistency is understood as a statement about what happens as the sample size "grows to infinity". If this sequence converges in probability to the true value θ0, we call it a consistent estimator; otherwise the estimator is said to be inconsistent.
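The idea of a sequence of estimators indexed by the sample size can be illustrated by simulation. The sketch below (an illustrative setup, not from the text: the sample mean of i.i.d. normal draws with true mean theta0 = 2.0) estimates, for several values of n, the probability that the estimator falls within a fixed distance eps of θ0; consistency predicts this probability approaches one as n grows.

```python
import numpy as np

# Illustrative assumption: X_1, ..., X_n i.i.d. N(theta0, 25), and the
# estimator T_n is the sample mean, which is consistent for theta0.
rng = np.random.default_rng(0)
theta0, eps, reps = 2.0, 0.1, 200

probs = []
for n in (100, 10_000, 1_000_000):
    # Monte Carlo estimate over `reps` replications of
    # P(|T_n - theta0| < eps)
    means = np.array([rng.normal(theta0, 5.0, size=n).mean()
                      for _ in range(reps)])
    probs.append(float(np.mean(np.abs(means - theta0) < eps)))

print(probs)  # probabilities increasing toward 1 as n grows
```

Each entry of `probs` is closer to one than the last, matching the definition: the distribution of the sample mean concentrates around θ0 at rate 1/√n.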

Consistency as defined here is sometimes referred to as weak consistency. When convergence in probability is replaced by almost sure convergence, the sequence of estimators is said to be strongly consistent.
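Strong consistency replaces a limit of probabilities with a probability about the limit itself. In the same illustrative notation, with T_n the estimator from a sample of size n:

```latex
\Pr\!\left(\lim_{n\to\infty} T_n = \theta_0\right) = 1 .
```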
