Optimum one-bit quantization

This paper deals with discrete-input, one-bit-output quantization. A discrete input signal is subject to additive noise and is then quantized to zero or one by comparison with a threshold q. For finitely many fixed support points and a fixed threshold q, we first determine the mutual information of this channel. The capacity-achieving input distribution is shown to be concentrated on merely two extreme support points. Furthermore, an elegant representation of the corresponding probabilities is found. Finally, we set out to determine the optimum threshold q, which is an extremely hard problem. Graphical representations reveal that the objective function behaves in completely different ways depending on the choice of parameters and the noise distribution.
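The following is a minimal numerical sketch of the setting described above, under assumed parameters: Gaussian noise, hypothetical support points at ±1, and an illustrative noise standard deviation. It computes the mutual information of the induced binary-output channel for a two-point input distribution on the extreme support points and sweeps the threshold q to visualize the objective; names and values are illustrative and not taken from the paper.

```python
# Sketch: mutual information of a discrete-input, one-bit-output channel
# obtained by thresholding a noisy input at q (assumed Gaussian noise).
import numpy as np
from scipy.stats import norm

support = np.array([-1.0, 1.0])   # hypothetical extreme support points
sigma = 0.5                       # assumed noise standard deviation

def crossover_probs(q, support, sigma):
    """P(Y = 1 | X = x) = P(x + N > q) for Gaussian noise N."""
    return norm.sf(q - support, scale=sigma)

def binary_entropy(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def mutual_information(p_x, q, support, sigma):
    """I(X;Y) for input distribution p_x on the support points, threshold q."""
    p1_given_x = crossover_probs(q, support, sigma)
    p1 = np.dot(p_x, p1_given_x)              # P(Y = 1)
    return binary_entropy(p1) - np.dot(p_x, binary_entropy(p1_given_x))

# Sweep the threshold for a symmetric two-point input distribution,
# mirroring the abstract's observation that the optimum input lives on
# two extreme support points.
p_x = np.array([0.5, 0.5])
thresholds = np.linspace(-2.0, 2.0, 201)
mi = [mutual_information(p_x, q, support, sigma) for q in thresholds]
best = thresholds[int(np.argmax(mi))]
print(f"best threshold ~ {best:.2f}, I(X;Y) ~ {max(mi):.3f} bits")
```

Plotting `mi` against `thresholds` for different noise distributions and support points illustrates the qualitatively different shapes of the objective function mentioned in the abstract.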
