Likelihood Function of A Parameterized Model
Among many applications, we consider here one of broad theoretical and practical importance. Given a parameterized family of probability density functions (or probability mass functions in the case of discrete distributions)

    x ↦ f(x | θ),

where θ is the parameter, the likelihood function is written

    θ ↦ L(θ | x) = f(x | θ),
where x is the observed outcome of an experiment. In other words, when f(x | θ) is viewed as a function of x with θ fixed, it is a probability density function, and when viewed as a function of θ with x fixed, it is a likelihood function.
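The dual reading above can be made concrete with a small sketch. The Bernoulli model and the sample below are invented for illustration: f(x | θ) is the mass function of a single coin flip, and holding a sample fixed while varying θ gives the likelihood, which peaks at the sample proportion of successes.

```python
import math

def pmf(x, theta):
    """Probability mass f(x | theta) of one Bernoulli trial:
    theta for x = 1 (success), 1 - theta for x = 0 (failure)."""
    return theta if x == 1 else 1.0 - theta

def likelihood(theta, xs):
    """Likelihood L(theta | xs): the same expression f(xs | theta),
    now read as a function of theta with the sample xs held fixed."""
    return math.prod(pmf(x, theta) for x in xs)

# A fixed observed sample: 7 successes in 10 trials (hypothetical data).
xs = [1, 1, 0, 1, 1, 1, 0, 1, 0, 1]

# With theta fixed, f is a pmf in x: the masses sum to 1.
print(pmf(0, 0.3) + pmf(1, 0.3))  # 1.0

# With xs fixed, scanning theta over a grid shows the likelihood
# peaking at 0.7, the observed proportion of successes.
best_theta = max(range(1, 100), key=lambda t: likelihood(t / 100, xs)) / 100
print(best_theta)  # 0.7
```

Note that the likelihood slice, unlike the density slice, need not sum or integrate to 1 over θ; it is a relative measure of how well each parameter value explains the fixed sample.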
Note: the likelihood of a set of parameter values is not the same as the probability that those parameters are the right ones, given the observed sample. Interpreting the likelihood of a hypothesis given observed evidence as the probability of that hypothesis is a common error, with potentially disastrous real-world consequences in medicine, engineering, or jurisprudence. See prosecutor's fallacy for an example of this.
From a geometric standpoint, if we consider f(x, θ) as a function of two variables, then the family of probability distributions can be viewed as a family of curves parallel to the x-axis, while the family of likelihood functions is the family of orthogonal curves parallel to the θ-axis.