Joint Source-Channel Coding for Gaussian Sources over AWGN Channels using Variational Autoencoders

In this paper, we study joint source-channel coding of Gaussian sources over multiple AWGN channels, where the source dimension is greater than the number of channels. We model the system as a variational autoencoder (VAE) and show that its loss function takes a form that is an upper bound on the optimization objective obtained from rate-distortion theory. The constructed system employs two encoders that learn, without any explicit constraints, to split the source input space roughly in half. The system is trained jointly, end-to-end, in a data-driven manner. We achieve state-of-the-art results for certain configurations, in some cases 0.7 dB better than previous works. We also show that the trained encoder/decoder pair is robust: even if the channel conditions change by ±5 dB, the performance does not vary by more than 0.7 dB relative to a system trained at that channel condition. The trained system also generalizes, to an extent, when a single input dimension is dropped; in some scenarios it is less than 1 dB away from a system trained for that reduced dimension.
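
To make the described architecture concrete, the following is a minimal, hypothetical sketch (not the authors' released code) of a bandwidth-reducing joint source-channel coder trained end-to-end: two encoder branches map the Gaussian source to a smaller number of channel uses, a learned gate plays the role of the soft split of the input space, the channel adds white Gaussian noise at a fixed SNR, and a decoder is trained to minimize end-to-end distortion. The dimensions (2:1 mapping), layer sizes, SNR, and the gating mechanism are assumptions made purely for illustration.

```python
# Hypothetical sketch of a 2:1 joint source-channel coder with two encoder
# branches, a soft input-space split, an AWGN channel, and an MSE-trained decoder.
import torch
import torch.nn as nn

SOURCE_DIM, CHANNEL_DIM, SNR_DB = 2, 1, 10.0  # assumed 2:1 mapping at 10 dB SNR


def mlp(in_dim, out_dim, hidden=64):
    return nn.Sequential(
        nn.Linear(in_dim, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, out_dim),
    )


class JSCC(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc_a = mlp(SOURCE_DIM, CHANNEL_DIM)  # first encoder branch
        self.enc_b = mlp(SOURCE_DIM, CHANNEL_DIM)  # second encoder branch
        self.gate = mlp(SOURCE_DIM, 1)             # learns the input-space split
        self.dec = mlp(CHANNEL_DIM, SOURCE_DIM)

    def forward(self, x):
        # Soft selection between the two branches; training can push this
        # toward a near-dichotomy of the source space.
        w = torch.sigmoid(self.gate(x))
        z = w * self.enc_a(x) + (1.0 - w) * self.enc_b(x)
        # Normalize to unit average power per channel use.
        z = z / z.pow(2).mean(dim=0, keepdim=True).sqrt().clamp_min(1e-8)
        # AWGN channel at the assumed SNR (unit signal power).
        noise_std = 10.0 ** (-SNR_DB / 20.0)
        y = z + noise_std * torch.randn_like(z)
        return self.dec(y)


model = JSCC()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(5000):
    x = torch.randn(256, SOURCE_DIM)            # i.i.d. Gaussian source samples
    loss = nn.functional.mse_loss(model(x), x)  # end-to-end distortion
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this sketch the AWGN channel acts as the stochastic layer of the autoencoder, so minimizing the reconstruction MSE under the noise corresponds to the end-to-end distortion objective described above; robustness to SNR mismatch can be probed simply by evaluating the trained model with a different noise_std than the one used in training.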
