Duality-Based Bounds on Channel Capacity
Ph.D. Dissertation, ETH Zurich, January 2005
Abstract

In his famous 1948 paper "A Mathematical Theory of Communication," Claude E. Shannon derives the ultimate limit of reliable communication, the channel capacity, and gives a general expression for this limit as a function of the conditional probability distribution describing the channel. Unfortunately, this expression involves an optimization that is very difficult, if not infeasible, to evaluate analytically or numerically for most channels of interest. There is therefore great interest in good upper and lower bounds on channel capacity.

In this thesis a technique is proposed for the derivation of upper bounds on channel capacity. It is based on a dual expression for channel capacity in which the maximization (of mutual information) over distributions on the channel input alphabet is replaced by a minimization (of average relative entropy) over distributions on the channel output alphabet. Every choice of an output distribution leads to an upper bound on mutual information, and the chosen output distribution need not correspond to any distribution on the channel input. With a judicious choice of output distribution, one can often derive tight upper bounds on channel capacity.
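
The bound underlying this technique can be sketched as follows (in standard notation, which need not match the notation of the thesis): for a channel W(·|·), an input distribution Q inducing the output distribution QW, and an arbitrary distribution R(·) on the output alphabet,
\[
  \int D\bigl(W(\cdot|x) \,\big\|\, R(\cdot)\bigr)\, \mathrm{d}Q(x)
  \;=\; I(Q, W) + D\bigl((QW) \,\big\|\, R\bigr)
  \;\ge\; I(Q, W),
\]
with equality if and only if R = QW. Under suitable regularity conditions, maximizing over Q and minimizing over R then yields the dual expression \(C = \min_R \sup_x D\bigl(W(\cdot|x) \,\|\, R(\cdot)\bigr)\).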

Further, a technique is proposed for the analysis of the asymptotic capacity of cost-constrained channels. The technique is based on the observation that, under fairly mild conditions on the channel, any input distribution that achieves a mutual information with the same growth rate in the cost constraint as the channel capacity must escape to infinity; i.e., under such a distribution, for any finite cost the probability of the set of input symbols of lesser cost tends to zero as the cost constraint tends to infinity.
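
Formally, and with notation chosen here purely for illustration: if g(·) denotes the cost function, Υ the allowed average cost, and {Q_Υ} a family of input distributions satisfying the cost constraints whose mutual information exhibits the capacity growth rate, then escaping to infinity means that for every fixed K > 0,
\[
  \lim_{\Upsilon \to \infty} Q_{\Upsilon}\bigl(\{x : g(x) \le K\}\bigr) = 0.
\]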

The above techniques are applied to various channels. Firstly, the channel capacity of three channels that model optical communication is investigated: the free-space optical intensity channel, an optical intensity channel with input-dependent noise, and the Poisson channel. In all three cases both a peak- and an average-power constraint are imposed, where the average-to-peak-power ratio α ∈ (0, 1] is held constant. For the free-space optical intensity channel, new firm (i.e., not asymptotic) upper and lower bounds are derived that coincide asymptotically as the power constraints tend to infinity with their ratio held fixed. In the other two cases new firm lower bounds and asymptotic upper bounds are given. In the same asymptotic limit the bounds coincide, so that for all three models the capacity is asymptotically precisely known.
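
For the two intensity channels, for instance, the channel input X is proportional to the transmitted light intensity and hence nonnegative, so the two constraints can be written as (notation illustrative)
\[
  0 \le X \le A
  \qquad\text{and}\qquad
  \mathsf{E}[X] \le \alpha A,
\]
where A denotes the allowed peak power and α ∈ (0, 1] is the fixed average-to-peak-power ratio.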

Secondly, the above techniques are applied to multiple-antenna flat fading channels with memory, where the fading process is assumed to be regular (i.e., of finite entropy rate) and where the realization of the fading process is unknown at the transmitter and unknown (or only partially known) at the receiver. It is demonstrated that, at high signal-to-noise ratio (SNR), the capacity of such channels grows only double-logarithmically in the SNR. To better understand this phenomenon and the rates at which it occurs, the fading number is introduced as the second term in the high-SNR asymptotic expansion of capacity, and its value, or an estimate of it, is derived for various fading channels. It is observed that at rates significantly above the fading number, communication becomes extremely power-inefficient, thus posing a limit on practically achievable rates.
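
Concretely, the fading number χ is the constant in the high-SNR expansion
\[
  C(\mathrm{SNR}) = \log\log\mathrm{SNR} + \chi + o(1),
  \qquad \mathrm{SNR} \to \infty,
\]
or, equivalently, χ is the limit of C(SNR) − log log SNR as the SNR tends to infinity.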

For various fading channels upper and lower bounds on the fading number are presented: for single-input multiple-output (SIMO) fading channels with memory, the bounds coincide, thus yielding a complete characterization of the fading number for general stationary and ergodic fading processes with one transmit and one or more receive antennae. It is also demonstrated that for memoryless multiple-input single-output (MISO) fading channels, the fading number is achievable using beam-forming, and an expression for the optimal beam direction is derived. This direction depends on the fading law and is, in general, not the direction that maximizes the SNR on the induced single-input single-output (SISO) channel. Based on a new closed-form expression for the expectation of the logarithm of a non-central chi-square distributed random variable, some closed-form expressions for the fading number of various channels with Gaussian fading are provided, including SISO fading channels with stationary, ergodic, circularly symmetric, Gaussian fading. The fading number of the latter is determined by the fading mean, the fading variance, and the mean squared error in predicting the present fading from its past; it is not directly related to the Doppler spread.
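
The closed-form expression mentioned above can be stated as follows: if Z is a zero-mean, unit-variance, circularly symmetric, complex Gaussian random variable and d is a complex constant, then |d + Z|² is non-central chi-square distributed with two degrees of freedom, and
\[
  \mathsf{E}\bigl[\ln |d + Z|^2\bigr]
  = \ln |d|^2 - \mathrm{Ei}\bigl(-|d|^2\bigr),
\]
where Ei(·) denotes the exponential integral function. As a check, letting d tend to zero the right-hand side tends to −γ, with γ denoting Euler's constant, which is the expected logarithm of the exponentially distributed random variable |Z|².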

Finally, in the case of regular multiple-input multiple-output (MIMO) fading channels with memory, it is shown that all corresponding bounds derived in this thesis continue to hold when there is an additional noiseless feedback link from the receiver to the transmitter. Further, in the asymptotic case, it is shown that the fading number of general SISO fading channels is not changed by feedback, in spite of possible memory in the fading process. Moreover, for regular SISO fading channels with memory and partial receiver side-information, it is shown that noiseless feedback and the disclosure of the partial side-information at the transmitter do not increase the fading number.

In the case of non-regular (i.e., with entropy rate equal to negative infinity) SISO Gaussian fading with memory, noiseless feedback is shown to have no impact on the capacity pre-log (i.e., the ratio of channel capacity to the logarithm of the SNR at high SNR).
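
In symbols, writing Π for the pre-log (the symbol is chosen here for illustration), the result states that
\[
  \Pi = \lim_{\mathrm{SNR} \to \infty} \frac{C(\mathrm{SNR})}{\log \mathrm{SNR}}
\]
is unchanged by the availability of noiseless feedback.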