In electrical engineering, computer science, and information theory, channel capacity is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel. By the noisy-channel coding theorem, the channel capacity of a given channel is the limiting information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.
Information theory, developed by Claude E. Shannon during World War II, defines the notion of channel capacity and provides a mathematical model by which one can compute it. The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution.
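In the notation commonly used for this result (the symbols X, Y, and p_X are introduced here for illustration and do not appear above), the definition reads

C = \max_{p_X} I(X; Y),

where I(X; Y) is the mutual information between the channel input X and output Y, and the maximum is taken over all probability distributions p_X of the input. As a standard worked case, a binary symmetric channel that flips each transmitted bit with probability p has capacity C = 1 - H_b(p), where H_b is the binary entropy function; a noiseless channel (p = 0) therefore carries one bit per channel use, while a completely random one (p = 1/2) carries none.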