Concatenated Classic and Neural (CCN) Codes: ConcatenatedAE

Small neural networks (NNs) used for error correction have been shown to improve on classic channel codes and to adapt to changes in the channel model. We extend the code dimension of any such structure by reusing the same NN, under one-hot encoding, multiple times and serially concatenating it with an outer classic code. All NNs share the same network parameters, and each Reed-Solomon codeword symbol is the input to a different NN. We demonstrate significant improvements in block error probability over the small neural code alone on an additive Gaussian noise channel, as well as robustness to channel model changes.
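The encoder structure above can be sketched as follows. This is a minimal illustration, not the paper's implementation: a fixed random linear map stands in for the trained inner NN, the symbol size `M`, inner block length `N_CHANNEL`, and the example outer codeword are all assumed for illustration, and the outer Reed-Solomon encoder is treated abstractly (its output symbols are given directly). The point is the structure: every outer-code symbol is one-hot encoded and passed through the *same* shared inner encoder.

```python
import numpy as np

M = 4          # bits per outer-code symbol, i.e. symbols from GF(2^M); assumed value
Q = 2 ** M     # one-hot alphabet size
N_CHANNEL = 8  # channel uses per inner codeword; assumed value

rng = np.random.default_rng(0)

# Stand-in for the trained inner NN encoder: a single fixed linear map
# shared by every symbol position (the paper reuses one small NN with the
# same parameters at every position).
W = rng.standard_normal((Q, N_CHANNEL))

def one_hot(symbol: int) -> np.ndarray:
    """Map an outer-code symbol in {0, ..., Q-1} to a one-hot vector."""
    v = np.zeros(Q)
    v[symbol] = 1.0
    return v

def inner_encode(symbol: int) -> np.ndarray:
    """Encode one outer symbol with the shared inner encoder."""
    x = one_hot(symbol) @ W
    # Per-block power normalization, as is typical for learned channel codes.
    return x / np.linalg.norm(x) * np.sqrt(N_CHANNEL)

def concatenated_encode(outer_codeword) -> np.ndarray:
    """Apply the same inner encoder to every outer-code symbol."""
    return np.concatenate([inner_encode(s) for s in outer_codeword])

# Example: symbols of a hypothetical length-7 outer Reed-Solomon codeword
# over GF(16); the values here are placeholders, not a real RS encoding.
outer_cw = [3, 9, 0, 14, 7, 2, 11]
x = concatenated_encode(outer_cw)
print(x.shape)  # (56,) = 7 symbols x 8 channel uses each
```

At the receiver, each length-`N_CHANNEL` segment would be decoded by the (shared) inner NN decoder into a symbol estimate, and the outer Reed-Solomon decoder would then correct residual symbol errors across the block.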
