Neural Belief Propagation Decoding of CRC-Polar Concatenated Codes

Polar codes are the first class of error-correcting codes proven to achieve channel capacity as the code length tends to infinity, and they were selected for the fifth generation of cellular mobile communications (5G). In practical scenarios such as 5G, a cyclic redundancy check (CRC) is concatenated with a polar code to improve its finite-length performance. This concatenation mainly benefits sequential successive-cancellation list decoders; for parallel iterative belief propagation (BP) decoders, the CRC serves only as an early stopping criterion and yields marginal error-correction gains. In this paper, we first propose a CRC-polar BP (CPBP) decoder that exchanges extrinsic information between the factor graph of the polar code and that of the CRC. We then propose a neural CPBP (NCPBP) algorithm, which improves the CPBP decoder by introducing trainable normalizing weights on the concatenated factor graph. Our results on a 5G polar code of length 128 show that, at a frame error rate of 10^-5 and with a maximum of 30 iterations, CPBP and NCPBP outperform the conventional CRC-aided BP decoder by approximately 0.25 dB and 0.5 dB, respectively, while introducing almost no latency overhead.
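To illustrate the kind of message update that trainable normalizing weights modify, the sketch below shows a generic normalized min-sum check-node update, where a scalar weight scales each outgoing LLR message. This is only an illustrative sketch under our own naming (`check_node_update` is not from the paper), not the exact NCPBP update on the concatenated factor graph; in a neural BP decoder, one such weight per edge and per iteration would be learned by gradient descent rather than fixed.

```python
import numpy as np

def check_node_update(llrs, weight=1.0):
    """Normalized min-sum check-node update (illustrative sketch).

    For each edge i, the outgoing message is the product of the signs of
    the other incoming LLRs times their minimum magnitude, scaled by a
    normalizing weight. In a neural BP decoder this weight would be a
    trainable parameter (one per edge per iteration).
    Assumes all incoming LLRs are nonzero.
    """
    llrs = np.asarray(llrs, dtype=float)
    out = np.empty(len(llrs))
    for i in range(len(llrs)):
        others = np.delete(llrs, i)          # all incoming messages except edge i
        sign = np.prod(np.sign(others))      # parity of the other edges
        out[i] = weight * sign * np.min(np.abs(others))
    return out
```

For example, `check_node_update([2.0, -1.0, 3.0], weight=0.8)` returns `[-0.8, 1.6, -0.8]`: the smallest competing magnitude dominates each output, and the weight damps over-confident messages, which is the effect the trainable normalizing weights generalize.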