In probability theory, the expected value (or expectation, mathematical expectation, EV, mean, or first moment) of a random variable is the weighted average of all possible values that the random variable can take on. The weights used in computing this average are the probabilities in the case of a discrete random variable, or the values of the probability density function in the case of a continuous random variable. From a rigorous theoretical standpoint, the expected value is the integral of the random variable with respect to its probability measure.
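In symbols, these three cases can be written as follows (a standard formulation; the notation, with values x_i and probabilities p_i in the discrete case, density f in the continuous case, and underlying probability space (Ω, P) in the general case, is assumed rather than taken from the text above):

\[
\operatorname{E}[X] = \sum_i x_i\, p_i \quad\text{(discrete case)}, \qquad
\operatorname{E}[X] = \int_{-\infty}^{\infty} x\, f(x)\, dx \quad\text{(continuous case)}, \qquad
\operatorname{E}[X] = \int_{\Omega} X \, dP \quad\text{(general case)}.
\]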
The expected value may be intuitively understood via the law of large numbers: when it exists, the expected value is almost surely the limit of the sample mean as the sample size grows to infinity. More informally, it can be interpreted as the long-run average of the results of many independent repetitions of an experiment (e.g. repeated rolls of a die). The value may not be "expected" in the ordinary sense: the expected value itself may be an unlikely or even impossible outcome (such as having 2.5 children), just as a sample mean can be.
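A small simulation illustrates this long-run-average reading. The sketch below (an illustration, not part of the original article; the sample sizes and seed are assumptions) estimates the expected value of a fair six-sided die, whose true expectation is (1 + 2 + ... + 6)/6 = 3.5, and prints the running sample mean as the number of rolls grows.

```python
import random

def running_mean_of_die_rolls(n_rolls, seed=0):
    """Simulate n_rolls of a fair six-sided die and return the
    running sample mean at a few checkpoints."""
    rng = random.Random(seed)
    total = 0
    checkpoints = {}
    for i in range(1, n_rolls + 1):
        total += rng.randint(1, 6)          # one independent roll
        if i in (10, 100, 1_000, 10_000, 100_000):
            checkpoints[i] = total / i      # sample mean so far
    return checkpoints

if __name__ == "__main__":
    # The sample mean drifts toward the expected value 3.5
    # as the number of rolls grows (law of large numbers).
    for n, mean in running_mean_of_die_rolls(100_000).items():
        print(f"after {n:>6} rolls: sample mean = {mean:.4f}")
```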
The expected value does not exist for some distributions with heavy tails, such as the Cauchy distribution.
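The contrast with the die example can be seen in the sketch below (again an illustration with assumed seeds and sample sizes): it draws standard Cauchy variates via the inverse-CDF method and prints their sample means, which keep fluctuating rather than settling toward any fixed value, no matter how large the sample.

```python
import math
import random

def cauchy_sample_mean(n, seed):
    """Draw n standard Cauchy variates and return their sample mean."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        u = rng.random()
        total += math.tan(math.pi * (u - 0.5))   # standard Cauchy draw (inverse CDF)
    return total / n

if __name__ == "__main__":
    # Unlike the die example, these sample means do not converge,
    # reflecting the nonexistence of the Cauchy expected value.
    for n in (1_000, 10_000, 100_000):
        for seed in (1, 2, 3):
            print(f"n={n:>6}, seed={seed}: mean = {cauchy_sample_mean(n, seed):+.2f}")
```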