Completeness (statistics) - Definition

Definition

Consider a random variable X whose probability distribution belongs to a parametric family of probability distributions Pθ, parametrized by θ.

Formally, a statistic s is a measurable function of X; thus, a statistic s is evaluated on a random variable X, taking the value s(X), which is itself a random variable. A given realization of the random variable X(ω) is a data-point (datum), on which the statistic s takes the value s(X(ω)).
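
For a concrete illustration (this example is added here and is not part of the original definition): if X = (X_1, ..., X_n) is a random sample, a typical statistic is the sample sum, written in LaTeX as

    s(X) = \sum_{i=1}^{n} X_i,

which is itself a random variable; on an observed data-point x = X(ω) it takes the numeric value s(x) = x_1 + ... + x_n.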

The statistic s is said to be complete for the distribution of X if for every measurable function g (which must be independent of θ) the following implication holds:

Eθ(g(s(X))) = 0 for all θ implies that Pθ(g(s(X)) = 0) = 1 for all θ.

The statistic s is said to be boundedly complete if the implication holds for all bounded functions g.
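
As a standard worked example (a classical illustration added here, not taken from the text above): let X = (X_1, ..., X_n) be independent Bernoulli(θ) trials with 0 < θ < 1, and take s(X) = X_1 + ... + X_n. The sketch below, in LaTeX, indicates why s is complete.

    E_\theta\big(g(s(X))\big)
      = \sum_{t=0}^{n} g(t)\,\binom{n}{t}\,\theta^{t}(1-\theta)^{n-t}
      = (1-\theta)^{n} \sum_{t=0}^{n} g(t)\,\binom{n}{t}\,\Big(\frac{\theta}{1-\theta}\Big)^{t}.

If this expectation is 0 for every θ in (0, 1), the right-hand side is a polynomial in r = θ/(1-θ) that vanishes for all r > 0, so each coefficient g(t)\binom{n}{t} must be 0. Hence g(t) = 0 for t = 0, 1, ..., n, so Pθ(g(s(X)) = 0) = 1 for all θ, and s is complete (and therefore also boundedly complete).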
