In probability theory, the multinomial distribution is a generalization of the binomial distribution.
The binomial distribution is the probability distribution of the number of "successes" in n independent Bernoulli trials, with the same probability of "success" on each trial. In a multinomial distribution, the analog of the Bernoulli distribution is the categorical distribution, where each trial results in exactly one of some fixed finite number k of possible outcomes, with probabilities p1, ..., pk (so that pi ≥ 0 for i = 1, ..., k and p1 + ... + pk = 1), and there are n independent trials. Then let the random variables Xi indicate the number of times outcome i was observed over the n trials. The vector X = (X1, ..., Xk) follows a multinomial distribution with parameters n and p, where p = (p1, ..., pk).
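
The probability of observing a particular count vector (x1, ..., xk) with x1 + ... + xk = n is n!/(x1! ... xk!) · p1^x1 ... pk^xk. The following is a minimal sketch, assuming Python 3 with NumPy is available; the helper name multinomial_pmf is illustrative rather than taken from any library, while the sampler used is NumPy's own Generator.multinomial.

    # Minimal sketch, assuming Python 3 with NumPy installed.
    import math
    import numpy as np

    def multinomial_pmf(counts, probs):
        # n! / (x1! * ... * xk!) * p1^x1 * ... * pk^xk
        n = sum(counts)
        coef = math.factorial(n)
        for x in counts:
            coef //= math.factorial(x)   # exact: each partial quotient is an integer
        prob = 1.0
        for x, p in zip(counts, probs):
            prob *= p ** x
        return coef * prob

    # A fair die: k = 6 outcomes with equal probabilities, n = 10 rolls.
    p = [1.0 / 6] * 6
    print(multinomial_pmf([2, 1, 2, 2, 1, 2], p))

    # Drawing the count vector X = (X1, ..., Xk) directly; the counts sum to n.
    rng = np.random.default_rng(0)
    x = rng.multinomial(10, p)
    print(x, x.sum())

Because the counts must sum to n, the Xi are not independent of one another; each Xi on its own, however, is binomially distributed with parameters n and pi, which is the sense in which the multinomial distribution generalizes the binomial.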
Note that, in some fields, such as natural language processing, the categorical and multinomial distributions are conflated, and it is common to speak of a "multinomial distribution" when a categorical distribution is actually meant. This stems from the fact that it is sometimes convenient to express the outcome of a categorical distribution as a "1-of-K" vector (a vector with one element containing a 1 and all other elements containing a 0) rather than as an integer in the range 1 to k; in this form, a categorical distribution is equivalent to a multinomial distribution over a single observation.
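
As a quick illustration of that equivalence, again assuming Python 3 with NumPy, a multinomial draw with n = 1 is exactly a 1-of-K vector, and the position of its single 1 recovers the categorical outcome as an integer label:

    # Sketch only: a single multinomial trial (n = 1) yields a 1-of-K vector.
    import numpy as np

    rng = np.random.default_rng(1)
    p = [0.2, 0.5, 0.3]              # k = 3 categories

    one_hot = rng.multinomial(1, p)  # a length-3 vector with one 1 and two 0s
    label = int(one_hot.argmax())    # the same draw as an integer label (0-based index here)
    print(one_hot, label)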