Dimensions of Semantic Coding: Explicit and Implicit

Recent advances in deep learning have spurred interest in solving high-efficiency end-to-end transmission problems with methods that exploit the nonlinear properties of neural networks. These methods, which we call semantic coding, extract semantic features of the source signal across space and time and design source-channel coding schemes to transmit those features over wireless channels. Rapid progress has produced numerous research papers, but a consolidation of the accumulated knowledge has not yet emerged. In this article, we organize this expansive body of work into two paradigms: explicit and implicit semantic coding. We first examine these two paradigms by identifying the common and distinct components they use to build semantic communication systems. We then focus on applications of semantic coding to different transmission tasks. Our article highlights the improved quality, flexibility, and capability brought by semantic coded transmission. Finally, we point out future research directions.
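The pipeline sketched above — a learned nonlinear transform mapping the source to semantic features, followed by transmission of those features over a noisy channel — can be illustrated in miniature. The snippet below is a toy sketch only (hypothetical dimensions, random untrained weights standing in for learned transforms), not the architecture of any specific system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions: a 64-sample source, a 16-symbol latent.
SOURCE_DIM, LATENT_DIM = 64, 16

# Random untrained weights stand in for learned nonlinear transforms.
W_enc = rng.standard_normal((LATENT_DIM, SOURCE_DIM)) / np.sqrt(SOURCE_DIM)
W_dec = rng.standard_normal((SOURCE_DIM, LATENT_DIM)) / np.sqrt(LATENT_DIM)

def encode(x):
    """Nonlinear analysis transform: source -> semantic latent (channel symbols)."""
    z = np.tanh(W_enc @ x)
    # Power-normalize so transmitted symbols meet an average power constraint.
    return z / np.sqrt(np.mean(z ** 2))

def awgn(z, snr_db):
    """Additive white Gaussian noise channel at the given SNR."""
    noise_power = 10 ** (-snr_db / 10)
    return z + rng.standard_normal(z.shape) * np.sqrt(noise_power)

def decode(z_hat):
    """Nonlinear synthesis transform: noisy latent -> source reconstruction."""
    return W_dec @ np.tanh(z_hat)

x = rng.standard_normal(SOURCE_DIM)
x_hat = decode(awgn(encode(x), snr_db=10.0))
print(x_hat.shape)  # (64,)
```

In a real semantic coding system the two transforms would be trained end to end so that the latent carries task-relevant semantics and degrades gracefully with channel noise, rather than being random projections as here.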