Context mixing is a type of data compression algorithm in which the next-symbol predictions of two or more statistical models are combined to yield a prediction that is often more accurate than any of the individual predictions. For example, one simple method (not necessarily the best) is to average the probabilities assigned by each model. A random forest is another method: it outputs the prediction that is the mode of the predictions made by the individual models. Combining models is an active area of research in machine learning.
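The simple averaging method mentioned above can be sketched as follows. This is an illustrative toy, not a production mixer: `average_mix` and the two toy models are hypothetical names introduced here for the example.

```python
# Hypothetical sketch: combining next-symbol predictions by averaging.
# Each model is a dict mapping a symbol to its predicted probability;
# the mixed prediction is the arithmetic mean, symbol by symbol.

def average_mix(predictions):
    """Average a list of per-symbol probability distributions."""
    symbols = predictions[0].keys()
    n = len(predictions)
    return {s: sum(p[s] for p in predictions) / n for s in symbols}

# Two toy models that disagree about the next bit:
model_a = {"0": 0.9, "1": 0.1}  # confident model
model_b = {"0": 0.5, "1": 0.5}  # uninformative model

mixed = average_mix([model_a, model_b])
# mixed == {"0": 0.7, "1": 0.3}
```

Because every model gets equal weight, a single poor model can drag the mix down, which is one reason averaging is "not necessarily the best" method.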
The PAQ series of data compression programs uses context mixing to assign probabilities to individual bits of the input.
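Later PAQ versions mix bit probabilities in the logistic domain rather than averaging them directly: each input probability is "stretched" with the logit function, combined as a weighted sum, "squashed" back with the logistic function, and the weights are trained online by gradient descent. The sketch below is a minimal illustration of that idea, not PAQ's actual implementation; the class name, learning rate, and training data are assumptions made for the example.

```python
import math

def stretch(p):
    """Logit: map a probability in (0, 1) to the real line."""
    return math.log(p / (1.0 - p))

def squash(x):
    """Logistic function: inverse of stretch."""
    return 1.0 / (1.0 + math.exp(-x))

class LogisticMixer:
    """Minimal sketch of logistic mixing: combine bit probabilities in the
    stretched (logit) domain with weights updated by online gradient descent."""

    def __init__(self, n_models, lr=0.01):
        self.w = [0.0] * n_models  # one weight per model
        self.lr = lr
        self._inputs = None        # stretched probabilities of the last mix

    def mix(self, probs):
        """Return the mixed probability that the next bit is 1."""
        self._inputs = [stretch(p) for p in probs]
        return squash(sum(w * x for w, x in zip(self.w, self._inputs)))

    def update(self, p_mixed, bit):
        """After the bit is seen, nudge weights to reduce coding cost."""
        err = bit - p_mixed
        for i, x in enumerate(self._inputs):
            self.w[i] += self.lr * err * x

# Toy training run: model 0 is confident and correct (p=0.9 that bit=1),
# model 1 just guesses (p=0.5); the true bit is always 1.
mixer = LogisticMixer(2)
for _ in range(1000):
    p = mixer.mix([0.9, 0.5])
    mixer.update(p, bit=1)
```

After training, the mixer assigns more weight to the reliable model, so the mixed probability exceeds either fixed averaging scheme would give; an uninformative input (p = 0.5) stretches to 0 and contributes nothing, which is a convenient property of the logistic domain.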