Dynamic Programming for Discrete Memoryless Channel Quantization

In this paper, we present a general framework for applying dynamic programming (DP) to discrete memoryless channel (DMC) quantization. The DP has complexity $O(q (N-M)^2 M)$, where $q$, $N$, and $M$ are the alphabet sizes of the DMC input, the DMC output, and the quantizer output, respectively. Then, starting from the quadrangle inequality (QI), we apply two techniques to reduce the DP's complexity. One technique uses the SMAWK algorithm and achieves complexity $O(q (N-M) M)$, while the other is much simpler to implement and has complexity $O(q (N^2 - M^2))$. Next, we give a sufficient condition on the channel transition probability under which the two low-complexity techniques can be applied to design quantizers that maximize the $\alpha$-mutual information, a generalized objective function for channel quantization. This condition holds for the general $q$-ary input case, covering the previous work for $q = 2$ as a special case. Moreover, we propose a new idea, called iterative DP (IDP). Theoretical analysis and simulation results demonstrate that IDP improves quantizer design over the state-of-the-art methods in the literature.
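To make the baseline DP concrete, the following is a minimal illustrative sketch of quantizer design by dynamic programming in the style of Kurkoski and Yamamoto's formulation: outputs (assumed pre-sorted, e.g. by likelihood ratio for $q = 2$) are partitioned into $M$ contiguous groups so as to maximize the mutual information between the channel input and the quantizer output. All function and variable names here are ours for illustration, not from the paper, and the cubic-in-$(N-M)$ loop corresponds to the $O(q (N-M)^2 M)$ complexity before the QI/SMAWK speedups.

```python
import math


def channel_quantizer_dp(P, px, M):
    """Sketch of DP quantizer design for a DMC.

    P  : q lists of length N, P[x][y] = p(y|x); outputs assumed already
         sorted (e.g. by likelihood ratio, which suffices for q = 2).
    px : input distribution of length q.
    M  : number of quantizer outputs.
    Returns (maximized mutual information in bits, boundary indices).
    """
    q = len(P)
    N = len(P[0])

    def partial_mi(a, b):
        # Mutual-information contribution of mapping outputs a..b-1
        # to a single quantizer output z.
        pj = [px[x] * sum(P[x][a:b]) for x in range(q)]  # joint p(x, z)
        pz = sum(pj)
        if pz == 0:
            return 0.0
        return sum(p * math.log2(p / (px[x] * pz))
                   for x, p in enumerate(pj) if p > 0)

    NEG = float("-inf")
    # S[m][n]: best value covering the first n outputs with m levels.
    S = [[NEG] * (N + 1) for _ in range(M + 1)]
    S[0][0] = 0.0
    choice = [[0] * (N + 1) for _ in range(M + 1)]
    for m in range(1, M + 1):
        # Each remaining level must receive at least one output.
        for n in range(m, N - (M - m) + 1):
            for a in range(m - 1, n):
                if S[m - 1][a] == NEG:
                    continue
                v = S[m - 1][a] + partial_mi(a, n)
                if v > S[m][n]:
                    S[m][n], choice[m][n] = v, a
    # Backtrack the group boundaries.
    bounds, n = [N], N
    for m in range(M, 0, -1):
        n = choice[m][n]
        bounds.append(n)
    return S[M][N], bounds[::-1]
```

For example, quantizing a binary symmetric channel with crossover 0.1 and $N = M = 2$ recovers the identity quantizer and $I(X;Y) = 1 - H_b(0.1)$. The QI-based speedups in the paper replace the innermost minimization over $a$ with SMAWK row minima or a monotone search over the `choice` array.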
