LEARN Codes: Inventing Low-Latency Codes via Recurrent Neural Networks

Designing channel codes under low-latency constraints is one of the most demanding requirements in 5G standards. However, sharp characterizations of the performance of traditional codes are available only in the large block-length limit. Code designs guided by these asymptotic analyses require large block lengths, and hence long latency, to achieve the desired error rate. Furthermore, when codes designed for one channel (e.g., the Additive White Gaussian Noise (AWGN) channel) are used on another (e.g., a non-AWGN channel), heuristics are necessary to achieve any non-trivial performance, so such codes severely lack robustness as well as adaptivity. By jointly designing a recurrent neural network (RNN) based encoder and decoder, we propose an end-to-end learned neural code that outperforms the canonical convolutional code in the block-coding setting. Building on this experience of designing a novel neural block code, we propose a new class of codes under low-latency constraints, Low-latency Efficient Adaptive Robust Neural (LEARN) codes, which outperform state-of-the-art low-latency codes while also exhibiting robustness and adaptivity. LEARN codes show the potential of designing new versatile and universal codes for future communications via tools of modern deep learning coupled with communication engineering insights.

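As a rough illustration of the end-to-end setup described above (not the authors' exact architecture), the sketch below pairs a GRU-based encoder with a GRU-based decoder and trains them jointly through a simulated AWGN channel using a binary cross-entropy loss. The layer sizes, the rate-1/2 mapping, the power normalization, and the helper `train_step` are illustrative assumptions.

```python
import torch
import torch.nn as nn

class RNNEncoder(nn.Module):
    """Maps a block of K message bits to 2K real-valued channel symbols (rate 1/2)."""
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, num_layers=2,
                          batch_first=True, bidirectional=True)
        self.linear = nn.Linear(2 * hidden, 2)   # two coded symbols per message bit

    def forward(self, bits):                     # bits: (batch, K, 1) in {0, 1}
        h, _ = self.rnn(2.0 * bits - 1.0)        # map bits to +/-1 before encoding
        x = self.linear(h)                       # (batch, K, 2)
        # Normalize so the codeword roughly satisfies a unit average power constraint.
        return (x - x.mean()) / x.std()

class RNNDecoder(nn.Module):
    """Recovers per-bit posterior probabilities from noisy channel observations."""
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(input_size=2, hidden_size=hidden, num_layers=2,
                          batch_first=True, bidirectional=True)
        self.linear = nn.Linear(2 * hidden, 1)

    def forward(self, y):                        # y: (batch, K, 2)
        h, _ = self.rnn(y)
        return torch.sigmoid(self.linear(h))     # (batch, K, 1) bit probabilities

def train_step(encoder, decoder, optimizer, batch_size=500, K=100, snr_db=0.0):
    """One joint encoder/decoder update on random bits passed through an AWGN channel."""
    bits = torch.randint(0, 2, (batch_size, K, 1)).float()
    x = encoder(bits)
    sigma = 10 ** (-snr_db / 20.0)               # noise std for unit-power signal at the given SNR
    y = x + sigma * torch.randn_like(x)
    p = decoder(y)
    loss = nn.functional.binary_cross_entropy(p, bits)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

A training loop would instantiate both modules, wrap their combined parameters in a single optimizer (e.g., Adam), and call `train_step` repeatedly over random message blocks. For the low-latency setting targeted by LEARN, one would presumably restrict the bidirectional RNNs above to causal or limited-lookahead processing so that encoding and decoding only depend on a bounded window of future bits; the bidirectional version here corresponds to the block-code setting.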