Unicity Distance - Relation With Key Entropy and Plaintext Redundancy

Relation With Key Entropy and Plaintext Redundancy

The unicity distance can also be defined as the minimum amount of ciphertext required, in a ciphertext-only attack, to permit a computationally unlimited adversary to recover the unique encryption key.

The expected unicity distance is accordingly:

U = H(k) / D

where U is the unicity distance, H(k) is the entropy of the key space (e.g. 128 for 2^128 equiprobable keys, rather less if the key is a memorized pass-phrase), and D is the plaintext redundancy in bits per character.
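As a quick illustration, here is a minimal Python sketch of this formula; the function name and the example figures (a 128-bit equiprobable key and D ≈ 3.2 for English, derived below) are illustrative assumptions, not part of the original definition.

    def unicity_distance(key_entropy_bits: float, redundancy_bits_per_char: float) -> float:
        """Expected unicity distance U = H(k) / D, measured in ciphertext characters."""
        return key_entropy_bits / redundancy_bits_per_char

    # Illustrative assumption: a 128-bit equiprobable key used on English text (D ≈ 3.2).
    print(unicity_distance(128.0, 3.2))  # ≈ 40 characters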

Now an alphabet of 32 characters can carry 5 bits of information per character (as 32 = 2^5). In general the number of bits of information per character is lg N, where N is the number of characters in the alphabet. So for English each character can convey lg 26 = 4.7 bits of information. Here lg denotes the logarithm to base two; see Binary logarithm for details.

However, the average amount of actual information carried per character in meaningful English text is only about 1.5 bits per character. So the plaintext redundancy is D = 4.7 − 1.5 = 3.2.
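A short Python sketch of the same arithmetic, assuming the commonly cited figure of about 1.5 bits per character for meaningful English text:

    import math

    bits_per_char = math.log2(26)      # capacity: lg 26 ≈ 4.7 bits per character
    information_rate = 1.5             # assumed information content of meaningful English text
    redundancy = bits_per_char - information_rate
    print(f"capacity ≈ {bits_per_char:.1f}, D ≈ {redundancy:.1f} bits per character")
    # capacity ≈ 4.7, D ≈ 3.2 bits per character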

Basically, the bigger the unicity distance the better. For a one-time pad, given the unbounded entropy of the key space, we have U = ∞, which is consistent with the one-time pad being theoretically unbreakable.

For a simple substitution cipher, the number of possible keys is 26! ≈ 4.0329 × 10^26, the number of ways in which the alphabet can be permuted. Assuming all keys are equally likely, H(k) = lg(26!) ≈ 88.4 bits. For English text D = 3.2, thus U = 88.4/3.2 ≈ 28.

So given 28 characters of ciphertext it should be theoretically possible to work out an English plaintext and hence the key.
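The substitution-cipher figures above can be reproduced with a few lines of Python; this is just a check of the arithmetic, using the D ≈ 3.2 value derived earlier:

    import math

    key_entropy = math.log2(math.factorial(26))  # H(k) = lg(26!) ≈ 88.4 bits
    redundancy = 3.2                              # D for English text, from above
    unicity = key_entropy / redundancy
    print(f"H(k) ≈ {key_entropy:.1f} bits, U ≈ {unicity:.0f} characters")
    # H(k) ≈ 88.4 bits, U ≈ 28 characters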

