Context mixing is a type of data compression algorithm in which the next-symbol predictions of two or more statistical models are combined to yield a prediction that is often more accurate than any of the individual predictions. For example, one simple method (not necessarily the best) is to average the probabilities assigned by each model. The random forest is another approach: it outputs the prediction that is the mode of the predictions output by the individual models. Combining models is an active area of research in machine learning.
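The averaging method mentioned above can be sketched in a few lines. This is an illustrative sketch, not any particular compressor's implementation; the model distributions are made-up examples.

```python
# Simple context mixing by averaging: each "model" supplies a
# probability distribution over next symbols (a dict symbol -> prob),
# and the mixer averages the distributions element-wise.

def mix_by_averaging(predictions):
    """Average several probability distributions over the same alphabet."""
    symbols = set().union(*predictions)
    n = len(predictions)
    return {s: sum(p.get(s, 0.0) for p in predictions) / n for s in symbols}

# Two hypothetical models disagree about the next symbol:
model_a = {"a": 0.7, "b": 0.3}
model_b = {"a": 0.4, "b": 0.6}
mixed = mix_by_averaging([model_a, model_b])
# mixed["a"] == 0.55, mixed["b"] == 0.45
```

Because each input is a valid distribution, the average is one too, so the mixed prediction can be fed directly to an arithmetic coder.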
The PAQ series of data compression programs uses context mixing to assign probabilities to individual bits of the input.
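Later PAQ versions mix bit predictions in the logistic domain rather than averaging them directly: each probability is "stretched" to a logit, the logits are combined as a weighted sum, and the result is "squashed" back to a probability. A minimal sketch of that idea follows; the fixed weights and input probabilities are illustrative assumptions (PAQ adapts its mixing weights online as it compresses).

```python
import math

def stretch(p):
    """Map a probability in (0, 1) to the logit (log-odds) domain."""
    return math.log(p / (1.0 - p))

def squash(x):
    """Inverse of stretch: map a logit back to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

def logistic_mix(probs, weights):
    """Weighted sum of stretched probabilities, squashed back to (0, 1)."""
    return squash(sum(w * stretch(p) for p, w in zip(probs, weights)))

# Three hypothetical models predict the next bit is 1 with these probabilities:
p = logistic_mix([0.9, 0.6, 0.8], [0.5, 0.3, 0.2])
```

One consequence of mixing in the logit domain is that a confident model (probability near 0 or 1) contributes a large-magnitude logit and so can dominate the mix, which plain averaging does not allow.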