Likelihood-ratio Test - Simple-versus-simple Hypotheses

Simple-versus-simple Hypotheses

A statistical model is often a parametrized family of probability density functions or probability mass functions $f(x\mid\theta)$. A simple-versus-simple hypothesis test has completely specified models under both the null and alternative hypotheses, which for convenience are written in terms of fixed values of a notional parameter $\theta$:


\begin{align}
H_0 &:& \theta=\theta_0 ,\\
H_1 &:& \theta=\theta_1 .
\end{align}

Note that under either hypothesis, the distribution of the data is fully specified; there are no unknown parameters to estimate. The likelihood ratio test statistic can be written as:


\Lambda(x) = \frac{ L(\theta_0|x) }{ L(\theta_1|x) } = \frac{ f(x|\theta_0) }{ f(x|\theta_1) },

where $L(\theta|x) = f(x|\theta)$ is the likelihood function. Note that some references use the reciprocal of this ratio as the definition. In the form stated here, the likelihood ratio is small if the alternative model fits the data better than the null model, and the likelihood-ratio test provides the decision rule:

If $\Lambda > c$, do not reject $H_0$;
If $\Lambda < c$, reject $H_0$;
Reject $H_0$ with probability $q$ if $\Lambda = c$.
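As a concrete sketch of computing the statistic and applying the rule, consider hypothetical Bernoulli data with $\theta_0 = 0.5$, $\theta_1 = 0.75$, and an arbitrary illustrative cutoff $c$ (none of these values come from the text above):

```python
import math

def likelihood(theta, x):
    """Bernoulli likelihood L(theta | x) for a sequence of 0/1 outcomes."""
    return math.prod(theta if xi == 1 else 1.0 - theta for xi in x)

def likelihood_ratio(x, theta0, theta1):
    """Lambda(x) = L(theta0 | x) / L(theta1 | x); small values favour H1."""
    return likelihood(theta0, x) / likelihood(theta1, x)

# Hypothetical data: 6 successes in 8 trials.
x = [1, 0, 1, 1, 1, 0, 1, 1]
lam = likelihood_ratio(x, theta0=0.5, theta1=0.75)

c = 0.5  # illustrative cutoff; in practice chosen to achieve a desired significance level
decision = "reject H0" if lam < c else "do not reject H0"
```

Here $\Lambda(x) \approx 0.35 < c$, so the sketch rejects $H_0$: the data are more probable under $\theta_1$ than under $\theta_0$.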

The values $c$ and $q$ are usually chosen to obtain a specified significance level $\alpha$, through the relation $q\,P(\Lambda = c \mid H_0) + P(\Lambda < c \mid H_0) = \alpha$. The Neyman–Pearson lemma states that this likelihood-ratio test is the most powerful among all level-$\alpha$ tests for this problem.
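Continuing the Bernoulli setting as a sketch (all numbers hypothetical): when $\theta_1 > \theta_0$, $\Lambda$ is strictly decreasing in the success count $k$, so rejecting for $\Lambda < c$ is equivalent to rejecting for large $k$, and the cutoff (expressed as a count $k^*$) together with the randomization probability $q$ can be found by accumulating the null tail probability:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(K = k) for K ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1.0 - p)**(n - k)

def size_cutoff(n, theta0, alpha):
    """Find k* and randomization probability q satisfying
    P(K > k* | H0) + q * P(K = k* | H0) = alpha,
    scanning from the most extreme counts downward."""
    tail = 0.0
    for k in range(n, -1, -1):
        pk = binom_pmf(k, n, theta0)
        if tail + pk > alpha:
            return k, (alpha - tail) / pk
        tail += pk
    return 0, 1.0

kstar, q = size_cutoff(n=10, theta0=0.5, alpha=0.05)
# Reject for k > kstar; if k == kstar, reject with probability q.
```

The randomization step is what makes an exact level $\alpha$ attainable with a discrete statistic, which is the situation the relation for $c$ and $q$ above is designed to handle.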
