Gaussian Wiretap Channel With Amplitude and Variance Constraints

We consider the Gaussian wiretap channel with amplitude and variance constraints on the channel input. We first show that the entire rate-equivocation region of the Gaussian wiretap channel with an amplitude constraint is achieved by discrete input distributions with finite support. We prove this result by working with the existing single-letter description of the rate-equivocation region and showing that discrete distributions with finite support exhaust it. Our result highlights an important difference between the peak power (amplitude) constrained case and the average power (variance) constrained case. While in the average power constrained case the secrecy capacity and the capacity can be achieved simultaneously, our results show that in the peak power constrained case there is, in general, a tradeoff between the secrecy capacity and the capacity, in the sense that both may not be achievable simultaneously. We also show that under sufficiently small amplitude constraints this tradeoff does not arise, and both the secrecy capacity and the capacity are achieved by the symmetric binary distribution. Finally, we prove the optimality of discrete input distributions in the presence of an additional variance constraint.
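The average power constrained baseline referenced above has a well-known closed form: for a degraded Gaussian wiretap channel, the secrecy capacity is the difference between the main-channel and eavesdropper-channel AWGN capacities (Leung-Yan-Cheong and Hellman, 1978), and both are attained by the Gaussian input. The sketch below illustrates this baseline quantity; the function names and the specific parameter values are illustrative, not from the paper.

```python
import math

def awgn_capacity(snr):
    """Shannon capacity of a scalar AWGN channel, in bits per channel use."""
    return 0.5 * math.log2(1.0 + snr)

def gaussian_secrecy_capacity(p, n_main, n_eve):
    """Secrecy capacity of the degraded Gaussian wiretap channel under an
    average power constraint p, with main/eavesdropper noise variances
    n_main and n_eve: the difference of the two channel capacities,
    clipped at zero when the eavesdropper's channel is the stronger one."""
    return max(0.0, awgn_capacity(p / n_main) - awgn_capacity(p / n_eve))

# Example: unit input power, eavesdropper sees twice the noise variance.
cs = gaussian_secrecy_capacity(1.0, 0.5, 1.0)
```

Under an amplitude (peak power) constraint no such closed form is available, which is why the paper's approach instead characterizes the optimal inputs as discrete distributions with finite support.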
