Context mixing is a type of data compression algorithm in which the next-symbol predictions of two or more statistical models are combined to yield a prediction that is often more accurate than any of the individual predictions. For example, one simple method (not necessarily the best) is to average the probabilities assigned by each model. A random forest offers another method: it outputs the prediction that is the mode of the predictions output by the individual models. Combining models is an active area of research in machine learning.
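The two combination methods mentioned above can be sketched as follows. This is an illustrative toy, not a real compressor: the model names and probability values are invented for the example, and the "models" are just fixed probability tables over a binary alphabet.

```python
from statistics import mode

# Hypothetical next-symbol probability estimates from three models over
# a binary alphabet {0, 1}; the values are purely illustrative.
model_a = {0: 0.7, 1: 0.3}
model_b = {0: 0.4, 1: 0.6}
model_c = {0: 0.8, 1: 0.2}

def average_mix(predictions):
    """Combine models by averaging each symbol's probability."""
    symbols = predictions[0].keys()
    return {s: sum(p[s] for p in predictions) / len(predictions)
            for s in symbols}

def mode_mix(predictions):
    """Combine models by majority vote over each model's most
    probable symbol, as a random forest does."""
    return mode(max(p, key=p.get) for p in predictions)

models = [model_a, model_b, model_c]
mixed = average_mix(models)   # averaged distribution over {0, 1}
vote = mode_mix(models)       # symbol predicted by most models
```

Averaging keeps a full probability distribution, which an arithmetic coder needs; majority voting discards the confidence information and yields only a single symbol.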
The PAQ series of data compression programs uses context mixing to assign probabilities to individual bits of the input.
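Later PAQ versions mix bit probabilities in the logit (stretched) domain, with mixing weights trained online by gradient descent. The class below is a simplified sketch in that spirit, not PAQ's actual implementation; the learning rate and the two input probabilities in the usage example are arbitrary illustrative values.

```python
import math

def stretch(p):
    """Map a probability to the logit domain: ln(p / (1 - p))."""
    return math.log(p / (1.0 - p))

def squash(x):
    """Inverse of stretch: the logistic function."""
    return 1.0 / (1.0 + math.exp(-x))

class LogisticMixer:
    """Toy logistic mixer: combines per-model bit probabilities as a
    weighted sum of their logits, then updates the weights toward the
    observed bit by online gradient descent."""

    def __init__(self, n_models, lr=0.02):
        self.w = [0.0] * n_models   # all-zero weights -> p = 0.5
        self.lr = lr
        self.x = [0.0] * n_models   # last stretched inputs

    def mix(self, probs):
        self.x = [stretch(p) for p in probs]
        return squash(sum(w * xi for w, xi in zip(self.w, self.x)))

    def update(self, p, bit):
        """Adjust weights after seeing the actual bit (0 or 1)."""
        err = bit - p
        for i in range(len(self.w)):
            self.w[i] += self.lr * err * self.x[i]

# Usage: two models that both lean toward bit 1; after training on a
# stream of 1-bits, the mixer learns to trust them more strongly.
mixer = LogisticMixer(2)
p0 = mixer.mix([0.6, 0.7])        # 0.5 before any training
for _ in range(100):
    p = mixer.mix([0.6, 0.7])
    mixer.update(p, 1)
p_trained = mixer.mix([0.6, 0.7])
```

Mixing in the logit domain means a confident model (probability near 0 or 1) contributes a large-magnitude input, so a well-trained mixer can let one confident model dominate several uncertain ones.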