Channel Capacity
The AWGN channel is represented by a series of outputs $Y_i$ at discrete time event index $i$. $Y_i$ is the sum of the input $X_i$ and noise $Z_i$, where $Z_i$ is independent and identically distributed and drawn from a zero-mean normal distribution with variance $N$ (the noise):

$$Z_i \sim \mathcal{N}(0, N)$$

$$Y_i = X_i + Z_i$$

The $Z_i$ are further assumed not to be correlated with the $X_i$.
The capacity of the channel is infinite unless the noise variance $N$ is nonzero and the $X_i$ are sufficiently constrained. The most common constraint on the input is the so-called "power" constraint, requiring that for a codeword $(x_1, x_2, \dots, x_k)$ transmitted through the channel, we have:

$$\frac{1}{k}\sum_{i=1}^{k} x_i^2 \le P,$$
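As a small numerical illustration, the power constraint can be enforced by rescaling: a minimal Python sketch, where the power budget $P$, the codeword length, and the random codeword itself are assumed example values rather than anything from the source.

```python
import numpy as np

# Hypothetical example: enforce the average-power constraint
# (1/k) * sum(x_i^2) <= P on a candidate codeword by rescaling.
P = 1.0                               # assumed power budget
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.2, size=1000)   # candidate codeword, k = 1000

avg_power = np.mean(x ** 2)           # (1/k) * sum(x_i^2)
if avg_power > P:
    x *= np.sqrt(P / avg_power)       # scale down to satisfy the constraint

print(np.mean(x ** 2) <= P + 1e-12)   # True
```

Scaling the whole codeword preserves its "shape" (the ratios between symbols) while bringing its average power onto the budget.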
where $P$ represents the maximum channel power. Therefore, the channel capacity for the power-constrained channel is given by:

$$C = \max_{f(x)\,:\,E\left[X^2\right]\le P} I(X;Y)$$
where $f(x)$ is the distribution of $X$. Expand $I(X;Y)$, writing it in terms of the differential entropy:

$$I(X;Y) = h(Y) - h(Y\mid X) = h(Y) - h(X+Z\mid X) = h(Y) - h(Z\mid X)$$
But $X$ and $Z$ are independent, therefore:

$$I(X;Y) = h(Y) - h(Z)$$
Evaluating the differential entropy of a Gaussian gives:

$$h(Z) = \frac{1}{2}\log\left(2\pi e N\right)$$
Because $X$ and $Z$ are independent and their sum gives $Y$:

$$E\left[Y^2\right] = E\left[(X+Z)^2\right] = E\left[X^2\right] + 2E[X]E[Z] + E\left[Z^2\right] \le P + N$$
From this bound, we infer from a property of the differential entropy (the Gaussian maximizes differential entropy among all distributions with a given variance) that:

$$h(Y) \le \frac{1}{2}\log\left(2\pi e (P+N)\right)$$
Therefore, the channel capacity is given by the highest achievable bound on the mutual information:

$$I(X;Y) \le \frac{1}{2}\log\left(2\pi e (P+N)\right) - \frac{1}{2}\log\left(2\pi e N\right)$$
where $I(X;Y)$ is maximized when:

$$X \sim \mathcal{N}(0, P)$$
Thus the channel capacity $C$ for the AWGN channel is given by:

$$C = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)$$
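In concrete terms the formula is a single line of code. A minimal sketch follows, with the capacity expressed in bits per channel use (base-2 logarithm); the function name and the example SNR values are my own, not from the source.

```python
import numpy as np

# Sketch of C = 0.5 * log2(1 + P/N), the AWGN channel capacity
# in bits per channel use.
def awgn_capacity(P, N):
    """Capacity of the power-constrained AWGN channel in bits/use."""
    return 0.5 * np.log2(1.0 + P / N)

# At an SNR of P/N = 3 the capacity is exactly 1 bit per use,
# since 0.5 * log2(1 + 3) = 0.5 * 2 = 1.
print(awgn_capacity(3.0, 1.0))   # 1.0
```

Doubling the power budget does not double the capacity: the logarithm means each extra bit per channel use requires roughly quadrupling $P/N$ at high SNR.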