Channel Capacity

In electrical engineering, computer science, and information theory, channel capacity is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel. By the noisy-channel coding theorem, the channel capacity of a given channel is the highest information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.

Information theory, developed by Claude E. Shannon during World War II, defines the notion of channel capacity and provides a mathematical model by which one can compute it. The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution.
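As a concrete illustration of that maximization, consider the binary symmetric channel (BSC), which flips each transmitted bit with crossover probability p. Its capacity has the closed form C = 1 - H2(p), where H2 is the binary entropy function, and the maximum of the mutual information I(X; Y) is attained at the uniform input distribution. The sketch below (the function names are illustrative, not from any particular library) computes I(X; Y) for a given input distribution and recovers the capacity by searching over input distributions:

```python
import math

def binary_entropy(p):
    """H2(p) in bits, with the convention H2(0) = H2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information_bsc(q, p):
    """I(X; Y) in bits for a BSC with crossover probability p
    and input distribution P(X = 1) = q."""
    # Output distribution: P(Y = 1) = q(1 - p) + (1 - q)p
    y1 = q * (1 - p) + (1 - q) * p
    # I(X; Y) = H(Y) - H(Y|X); for a BSC, H(Y|X) = H2(p)
    return binary_entropy(y1) - binary_entropy(p)

def bsc_capacity(p, steps=10_000):
    """Approximate C = max over input distributions of I(X; Y)
    by a grid search on q; the maximum sits at q = 1/2."""
    return max(mutual_information_bsc(k / steps, p)
               for k in range(steps + 1))
```

For p = 0.1 this yields roughly 0.531 bits per channel use, matching 1 - H2(0.1); for p = 0.5 the output is independent of the input and the capacity is zero. For channels without a closed form, the same maximization is performed iteratively, e.g. by the Blahut-Arimoto algorithm.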

