A new Type Size code for universal one-to-one compression of parametric sources

We consider universal source coding of an exponential family of i.i.d. distributions at short blocklengths. We present a variation of the previously introduced Type Size code, in which type classes are characterized by neighborhoods of the minimal sufficient statistics. We show that there is no loss in dispersion compared to the non-universal setting, and we identify the third-order coding rate of this variation of the Type Size code for compression of such parametric sources.

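To make the underlying coding idea concrete, the sketch below illustrates the basic Type Size construction for the simplest case of an i.i.d. Bernoulli source, whose minimal sufficient statistic is the number of ones: sequences are grouped by that statistic, type classes are enumerated from smallest to largest, and a one-to-one (non-prefix) binary code assigns shorter codewords to sequences in smaller classes. This is only a minimal illustration under stated assumptions: the binary alphabet, the function name `type_size_code_lengths`, and the grouping by exact values of the statistic (rather than the neighborhoods of the minimal sufficient statistic used in the paper's variation) are choices made for this example, not the paper's construction.

```python
from itertools import product
from math import comb, floor, log2


def type_size_code_lengths(n):
    """Assign one-to-one codeword lengths to all binary sequences of length n,
    ordering type classes from smallest to largest (Type Size code idea)."""
    # For an i.i.d. Bernoulli sample, the minimal sufficient statistic is the
    # number of ones k; the type class sizes are the binomial coefficients.
    class_sizes = {k: comb(n, k) for k in range(n + 1)}

    # Bucket the 2^n sequences by their sufficient statistic k.
    buckets = {k: [] for k in range(n + 1)}
    for x in product((0, 1), repeat=n):
        buckets[sum(x)].append(x)

    # Enumerate sequences with the smallest type classes first; the i-th
    # sequence (0-indexed) of a one-to-one, non-prefix binary code can be
    # assigned a codeword of length floor(log2(i + 1)) bits.
    lengths = {}
    index = 0
    for k in sorted(class_sizes, key=class_sizes.get):
        for x in buckets[k]:
            lengths[x] = floor(log2(index + 1))
            index += 1
    return lengths


if __name__ == "__main__":
    lengths = type_size_code_lengths(8)
    # Sequences in small type classes (k near 0 or n) receive short codewords;
    # sequences in the largest type class (k = n/2) receive the longest ones.
    print(lengths[(0,) * 8], lengths[(0, 1) * 4])
```

In this toy setting the all-zeros sequence, whose type class is a singleton, gets the empty codeword, while balanced sequences in the largest class get the longest codewords; the paper's variation applies the same ordering principle with type classes defined through neighborhoods of the minimal sufficient statistic of the exponential family.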