Kolmogorov Complexity - History and Context

Algorithmic information theory is the area of computer science that studies Kolmogorov complexity and other complexity measures on strings (or other data structures).

The concept and theory of Kolmogorov complexity are based on a crucial theorem first discovered by Ray Solomonoff, who published it in 1960 in "A Preliminary Report on a General Theory of Inductive Inference" as part of his invention of algorithmic probability. He gave a more complete description in his 1964 two-part paper "A Formal Theory of Inductive Inference," published in Information and Control.

Andrey Kolmogorov later independently published this theorem in Problems of Information Transmission (1965). Gregory Chaitin also presented it in the Journal of the ACM; his paper, submitted in October 1966 and revised in December 1968, cites both Solomonoff's and Kolmogorov's papers.

The theorem says that, among algorithms that decode strings from their descriptions (codes), there exists an optimal one. For every string, this algorithm allows codes as short as those allowed by any other algorithm, up to an additive constant that depends on the two algorithms but not on the string itself. Solomonoff used this algorithm and the code lengths it allows to define a "universal probability" of a string, on which inductive inference of the subsequent digits of the string can be based. Kolmogorov used this theorem to define several functions of strings, including complexity, randomness, and information.
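
In modern notation, this invariance theorem is commonly stated as follows (a standard textbook formulation rather than a quotation from the original papers; here K_A(x) denotes the length of a shortest description of the string x relative to decoder A):

    \exists U \;\forall A \;\exists c_A \;\forall x : \quad K_U(x) \le K_A(x) + c_A

That is, a universal decoder U compresses every string at least as well as any other decoder A, at the cost of an additive constant c_A (intuitively, the length of a program that simulates A on U) that does not depend on x.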

When Kolmogorov became aware of Solomonoff's work, he acknowledged Solomonoff's priority. For several years, Solomonoff's work was better known in the Soviet Union than in the Western world. The general consensus in the scientific community, however, was to associate this type of complexity with Kolmogorov, who was concerned with the randomness of a sequence, while algorithmic probability became associated with Solomonoff, who focused on prediction using his invention of the universal a priori probability distribution.

There are several other variants of Kolmogorov complexity, or algorithmic information. The most widely used one, prefix complexity, is based on self-delimiting programs and is mainly due to Leonid Levin (1974).

An axiomatic approach to Kolmogorov complexity based on Blum axioms (Blum 1967) was introduced by Mark Burgin in a paper presented for publication by Andrey Kolmogorov (Burgin 1982).

Some consider the naming of the concept after Kolmogorov, rather than Solomonoff, to be an example of the Matthew effect.
