Channel Capacity
The AWGN channel is represented by a series of outputs $Y_i$ at discrete time event index $i$. $Y_i$ is the sum of the input $X_i$ and noise, $Z_i$, where $Z_i$ is independent and identically distributed and drawn from a zero-mean normal distribution with variance $N$ (the noise):

$$Z_i \sim \mathcal{N}(0, N)$$

$$Y_i = X_i + Z_i$$

The $Z_i$ are further assumed not to be correlated with the $X_i$.
The capacity of the channel is infinite unless the noise $N$ is nonzero and the $X_i$ are sufficiently constrained. The most common constraint on the input is the so-called "power" constraint, requiring that for a codeword $(x_1, x_2, \dots, x_k)$ transmitted through the channel, we have:

$$\frac{1}{k}\sum_{i=1}^{k} x_i^2 \leq P,$$
where $P$ represents the maximum channel power. Therefore, the channel capacity for the power-constrained channel is given by:

$$C = \max_{f(x)\,:\,E\left[X^2\right] \leq P} I(X;Y)$$
where $f(x)$ is the distribution of $X$. Expand $I(X;Y)$, writing it in terms of the differential entropy:

$$\begin{aligned}
I(X;Y) &= h(Y) - h(Y \mid X) \\
       &= h(Y) - h(X + Z \mid X) \\
       &= h(Y) - h(Z \mid X)
\end{aligned}$$
But $X$ and $Z$ are independent, therefore:

$$I(X;Y) = h(Y) - h(Z)$$
Evaluating the differential entropy of a Gaussian gives:

$$h(Z) = \frac{1}{2}\log(2 \pi e N)$$
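This closed form can be sanity-checked numerically. The sketch below (an illustration, not part of the original derivation; the variance value is an arbitrary choice) estimates $h(Z) = E[-\ln f(Z)]$ by Monte Carlo sampling in nats and compares it against $\frac{1}{2}\ln(2\pi e N)$:

```python
import math
import random

# Illustrative noise variance (arbitrary choice for this check)
N = 2.0

random.seed(0)
samples = [random.gauss(0.0, math.sqrt(N)) for _ in range(200_000)]

def neg_log_pdf(z, var):
    # -ln f(z) for a zero-mean Gaussian density with the given variance
    return 0.5 * math.log(2 * math.pi * var) + z * z / (2 * var)

# Monte Carlo estimate of h(Z) = E[-ln f(Z)], in nats
h_est = sum(neg_log_pdf(z, N) for z in samples) / len(samples)

# Closed-form differential entropy of a Gaussian, in nats
h_exact = 0.5 * math.log(2 * math.pi * math.e * N)
```

The two values agree to within Monte Carlo error, confirming the formula used in the derivation.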
Because $X$ and $Z$ are independent and their sum gives $Y$:

$$E(Y^2) = E\big((X+Z)^2\big) = E(X^2) + 2E(X)E(Z) + E(Z^2) = P + N$$
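The second-moment identity above can also be verified empirically. The following sketch (illustrative values of $P$ and $N$ are assumptions, not from the original text) draws independent Gaussian $X$ and $Z$ and checks that the sample second moment of $Y = X + Z$ is close to $P + N$:

```python
import math
import random

# Illustrative signal power and noise variance
P, N = 1.0, 0.5

random.seed(0)
n = 200_000
second_moment = 0.0
for _ in range(n):
    x = random.gauss(0.0, math.sqrt(P))  # input X ~ N(0, P)
    z = random.gauss(0.0, math.sqrt(N))  # noise Z ~ N(0, N), independent of X
    y = x + z                            # channel output Y = X + Z
    second_moment += y * y
second_moment /= n                       # empirical E[Y^2], should be near P + N
```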
From this bound, we infer from a property of the differential entropy (the Gaussian distribution maximizes differential entropy among all distributions with a given second moment) that

$$h(Y) \leq \frac{1}{2}\log\big(2 \pi e (P+N)\big)$$
Therefore the channel capacity is given by the highest achievable bound on the mutual information:

$$I(X;Y) \leq \frac{1}{2}\log\big(2 \pi e (P+N)\big) - \frac{1}{2}\log(2 \pi e N)$$
where $I(X;Y)$ is maximized when:

$$X \sim \mathcal{N}(0, P)$$
Thus the channel capacity $C$ for the AWGN channel is given by:

$$C = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)$$
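The final formula is easy to evaluate directly. A minimal helper (the function name is an illustrative choice; base-2 logarithm gives capacity in bits per channel use):

```python
import math

def awgn_capacity(P, N):
    """Shannon capacity of the power-constrained AWGN channel,
    C = (1/2) * log2(1 + P/N), in bits per channel use."""
    return 0.5 * math.log2(1.0 + P / N)

# Example: at 0 dB SNR (P = N), the capacity is half a bit per channel use.
c_0db = awgn_capacity(1.0, 1.0)   # 0.5
```

Note the diminishing returns: quadrupling the SNR from $P/N = 3$ to $P/N = 15$ adds only one bit per channel use, since capacity grows logarithmically in SNR.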