Second-Order Asymptotics for Communication Under Strong Asynchronism

The capacity under strong asynchronism was recently shown to be essentially unaffected by the imposed decoding delay (the time elapsed between when information becomes available at the transmitter and when it is decoded) and by the output sampling rate. This paper shows that, in contrast with capacity, the second-order term in the maximum rate expansion is sensitive to both parameters. When the receiver must locate the sent codeword exactly, and therefore achieves the minimum delay equal to the blocklength <inline-formula> <tex-math notation="LaTeX">$n$ </tex-math></inline-formula>, the second-order term in the maximum rate expansion is of order <inline-formula> <tex-math notation="LaTeX">$\Theta (1/\rho)$ </tex-math></inline-formula> for any sampling rate <inline-formula> <tex-math notation="LaTeX">$\rho =O(1/\sqrt {n})$ </tex-math></inline-formula> (with <inline-formula> <tex-math notation="LaTeX">$\rho =\omega (1/n)$ </tex-math></inline-formula>, since otherwise reliable communication is impossible). By contrast, if <inline-formula> <tex-math notation="LaTeX">$\rho =\omega (1/\sqrt {n})$ </tex-math></inline-formula>, then the second-order term is the same as under full sampling and is given by a standard <inline-formula> <tex-math notation="LaTeX">$\Theta (\sqrt {n})$ </tex-math></inline-formula> term. However, if the delay constraint is only slightly relaxed to <inline-formula> <tex-math notation="LaTeX">$n(1+o(1))$ </tex-math></inline-formula>, then this order transition (between <inline-formula> <tex-math notation="LaTeX">$\rho =O(1/\sqrt {n})$ </tex-math></inline-formula> and <inline-formula> <tex-math notation="LaTeX">$\rho =\omega (1/\sqrt {n})$ </tex-math></inline-formula>) vanishes and the second-order term remains the same as under full sampling for any <inline-formula> <tex-math notation="LaTeX">$\rho =\omega (1/n)$ </tex-math></inline-formula>.