Channel Capacity
The AWGN channel is represented by a series of outputs $Y_i$ at discrete time event index $i$. $Y_i$ is the sum of the input $X_i$ and the noise $Z_i$, where $Z_i$ is independent and identically distributed and drawn from a zero-mean normal distribution with variance $N$ (the noise):

$$Z_i \sim \mathcal{N}(0, N), \qquad Y_i = X_i + Z_i.$$

The $Z_i$ are further assumed to not be correlated with the $X_i$.
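As a concrete illustration of this model, the following sketch simulates a block of channel uses, assuming Python with NumPy; the power $P$ and noise variance $N$ are arbitrary example values, and the Gaussian input is used purely for illustration (the model itself does not fix the input distribution).

```python
import numpy as np

rng = np.random.default_rng(0)

k = 100_000          # number of channel uses (example value)
P = 4.0              # input power E[X^2] (example value)
N = 1.0              # noise variance (example value)

# i.i.d. zero-mean Gaussian noise with variance N, uncorrelated with the input
x = rng.normal(0.0, np.sqrt(P), size=k)   # example input samples
z = rng.normal(0.0, np.sqrt(N), size=k)

# the channel adds the noise sample to each input sample
y = x + z

print("empirical E[X^2]:", np.mean(x ** 2))   # close to P
print("empirical E[Z^2]:", np.mean(z ** 2))   # close to N
```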
The capacity of the channel is infinite unless the noise variance $N$ is nonzero and the $X_i$ are sufficiently constrained. The most common constraint on the input is the so-called "power" constraint, requiring that for a codeword $(x_1, x_2, \ldots, x_k)$ transmitted through the channel, we have

$$\frac{1}{k}\sum_{i=1}^{k} x_i^2 \leq P,$$

where $P$ represents the maximum channel power.
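A minimal sketch of checking this constraint on a candidate codeword, assuming NumPy and using hypothetical codeword values:

```python
import numpy as np

P = 4.0                                             # maximum average power (example value)
codeword = np.array([1.5, -2.0, 0.5, 2.2, -1.0])    # hypothetical codeword

avg_power = np.mean(codeword ** 2)                  # (1/k) * sum_i x_i^2
print(avg_power, avg_power <= P)                    # 2.468 True
```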
Therefore, the channel capacity for the power-constrained channel is given by

$$C = \max_{f(x)\,:\,E\left[X^2\right] \leq P} I(X;Y),$$
where $f(x)$ is the distribution of $X$. Expanding $I(X;Y)$ in terms of the differential entropy:

$$I(X;Y) = h(Y) - h(Y \mid X) = h(Y) - h(X + Z \mid X) = h(Y) - h(Z \mid X).$$
But $X$ and $Z$ are independent, therefore

$$I(X;Y) = h(Y) - h(Z).$$
Evaluating the differential entropy of a Gaussian gives

$$h(Z) = \frac{1}{2}\log\left(2\pi e N\right).$$
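This closed form can be spot-checked with a Monte Carlo estimate of $h(Z) = E[-\log f(Z)]$, where $f$ is the Gaussian density; the sketch below assumes NumPy and an arbitrary example value for $N$.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1.5                                        # noise variance (example value)
z = rng.normal(0.0, np.sqrt(N), size=200_000)

# h(Z) = E[-log f(Z)]; average the negative log-density over the samples
h_estimate = np.mean(0.5 * np.log(2 * np.pi * N) + z ** 2 / (2 * N))
h_closed_form = 0.5 * np.log(2 * np.pi * np.e * N)

print(h_estimate, h_closed_form)               # both about 1.62 nats
```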
Because $X$ and $Z$ are independent and their sum gives $Y$:

$$E\left[Y^2\right] = E\left[(X+Z)^2\right] = E\left[X^2\right] + 2E[X]\,E[Z] + E\left[Z^2\right] \leq P + N.$$
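This second-moment bound can also be verified numerically; in the sketch below (NumPy assumed, example parameters) the input is zero-mean with $E[X^2]$ exactly $P$, so the bound is met with equality even though the input is not Gaussian.

```python
import numpy as np

rng = np.random.default_rng(1)

k, P, N = 500_000, 4.0, 1.0                          # example block length, power, noise variance
x = rng.uniform(-np.sqrt(3 * P), np.sqrt(3 * P), k)  # zero-mean uniform input with E[X^2] = P
z = rng.normal(0.0, np.sqrt(N), k)
y = x + z

print(np.mean(y ** 2))                               # close to P + N = 5.0
```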
From this bound on $E\left[Y^2\right]$, and the property that the differential entropy for a given second moment is maximized by a normal distribution, we infer that

$$h(Y) \leq \frac{1}{2}\log\left(2\pi e (P+N)\right).$$
Therefore the channel capacity is given by the highest achievable bound on the mutual information:

$$I(X;Y) \leq \frac{1}{2}\log\left(2\pi e (P+N)\right) - \frac{1}{2}\log\left(2\pi e N\right),$$
where $I(X;Y)$ is maximized when

$$X \sim \mathcal{N}(0, P).$$
Thus the channel capacity $C$ for the AWGN channel is given by

$$C = \frac{1}{2}\log\left(1 + \frac{P}{N}\right).$$
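As a worked example, with $P/N = 15$ the formula gives exactly two bits per channel use; the sketch below (NumPy assumed, example values) evaluates it in both nats and bits.

```python
import numpy as np

P, N = 15.0, 1.0                       # example signal power and noise variance
snr = P / N

C_nats = 0.5 * np.log(1 + snr)         # natural logarithm: capacity in nats per channel use
C_bits = 0.5 * np.log2(1 + snr)        # base-2 logarithm: the same quantity in bits per channel use

print(C_nats, C_bits)                  # about 1.386 nats/use, exactly 2.0 bits/use
```

Only the base of the logarithm changes between the two figures; the choice of base fixes the unit (nats or bits) but not the capacity itself.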