Definition
Consider a random variable X whose probability distribution belongs to a parametric family of probability distributions Pθ indexed by a parameter θ.
Formally, a statistic s is a measurable function of X; evaluated at X, it takes the value s(X), which is itself a random variable. A given realization X(ω) of the random variable is a data point (datum), on which the statistic takes the value s(X(ω)).
The statistic s is said to be complete for the distribution of X if, for every measurable function g (which must not depend on θ), the following implication holds:
- E(g(s(X))) = 0 for all θ implies that Pθ(g(s(X)) = 0) = 1 for all θ.
The statistic s is said to be boundedly complete if the implication holds for all bounded measurable functions g.
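The definition can be illustrated with a standard example, sketched numerically below under assumptions not stated in the text above: for n i.i.d. Bernoulli(θ) trials, the statistic T = X₁ + … + Xₙ is complete, because E(g(T)) = Σₜ g(t) C(n,t) θᵗ (1−θ)ⁿ⁻ᵗ is a polynomial in θ, and a polynomial vanishing for all θ must have every coefficient zero, forcing g(t) = 0 for t = 0, …, n. The function names below (`expected_g`, the candidate g's) are illustrative, not from any particular library.

```python
# Sketch: completeness of T = X1 + ... + Xn for i.i.d. Bernoulli(theta) trials.
# Here T ~ Binomial(n, theta), so E_theta[g(T)] is a polynomial in theta;
# if it vanishes for every theta, all its coefficients (hence all g(t)) are zero.
from math import comb

def expected_g(g, n, theta):
    """E_theta[g(T)] for T ~ Binomial(n, theta)."""
    return sum(g(t) * comb(n, t) * theta**t * (1 - theta)**(n - t)
               for t in range(n + 1))

n = 3
g_zero = lambda t: 0.0        # the zero function: expectation vanishes for all theta
g_nonzero = lambda t: t - 1.0  # a nonzero g: E[g(T)] = n*theta - 1, not identically zero

thetas = [0.1, 0.3, 0.5, 0.7, 0.9]
assert all(expected_g(g_zero, n, th) == 0.0 for th in thetas)
assert any(abs(expected_g(g_nonzero, n, th)) > 1e-12 for th in thetas)
```

The second assertion shows the contrapositive in action: a g that is not almost surely zero produces a nonzero expectation for at least one (here, almost every) value of θ.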