The mathematical theory of information is based on probability theory and statistics, and measures information using several related quantities. The choice of logarithmic base in the formulae that follow determines the unit of information entropy that is used. The most common unit of information is the bit, based on the binary logarithm. Other units include the nat, based on the natural logarithm, and the hartley, based on the base-10 (common) logarithm.
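As a quick illustration, the same distribution yields a different numerical entropy depending on the chosen base: a fair coin flip carries 1 bit, about 0.693 nats, or about 0.301 hartleys. The sketch below is a minimal example using only Python's standard math module; the helper name entropy is chosen here for illustration and is not part of any particular library.

    import math

    def entropy(probs, base=2.0):
        """Shannon entropy of a discrete distribution.
        base=2 gives bits, base=math.e gives nats, base=10 gives hartleys.
        Terms with p == 0 are skipped, following the 0 log 0 = 0 convention."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    fair_coin = [0.5, 0.5]
    print(entropy(fair_coin, base=2))       # 1.0 bit
    print(entropy(fair_coin, base=math.e))  # ~0.693 nats
    print(entropy(fair_coin, base=10))      # ~0.301 hartleys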
In what follows, an expression of the form $p \log p$ is considered by convention to be equal to zero whenever $p = 0$. This is justified because $\lim_{p \to 0^{+}} p \log p = 0$ for any logarithmic base.
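One way to verify this limit, for an arbitrary base $b$, is to rewrite the product as a quotient and apply L'Hôpital's rule:

$$\lim_{p \to 0^{+}} p \log_b p = \lim_{p \to 0^{+}} \frac{\log_b p}{1/p} = \lim_{p \to 0^{+}} \frac{1/(p \ln b)}{-1/p^{2}} = \lim_{p \to 0^{+}} \frac{-p}{\ln b} = 0.$$

Since the base $b$ enters only through the constant factor $\ln b$, the convention holds regardless of whether the unit is the bit, the nat, or the hartley.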
Read more about Quantities of Information: Self-information, Entropy, Joint Entropy, Conditional Entropy (Equivocation), Kullback–Leibler Divergence (Information Gain), Mutual Information (Transinformation), Differential Entropy