Entropy and algorithmic complexity in quantum information theory

A theorem of Brudno states that the entropy rate of a classical ergodic information source equals the algorithmic complexity per symbol of almost every sequence it emits. Recent advances in the theory and technology of quantum information raise the question of whether a similar relation holds for ergodic quantum sources. In this paper, we discuss a quantum generalization of Brudno's result, which connects the von Neumann entropy rate with a recently proposed notion of quantum algorithmic complexity.
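As a minimal sketch of the statement being generalized, assuming the standard formulation of Brudno's theorem (the symbols h, s, K, QC, and \rho^{(n)} are our notation, not taken from the paper):

% Classical Brudno theorem (sketch): for an ergodic source with
% Shannon (Kolmogorov-Sinai) entropy rate h, almost every emitted
% sequence \omega = \omega_1 \omega_2 \ldots satisfies
\[
  \lim_{n \to \infty} \frac{K(\omega_1 \cdots \omega_n)}{n} = h,
\]
% where K is the Kolmogorov complexity of the first n symbols.
%
% Quantum analogue discussed in the paper (notation hypothetical):
% for an ergodic quantum source with von Neumann entropy rate s,
\[
  \lim_{n \to \infty} \frac{QC\bigl(\rho^{(n)}\bigr)}{n} = s,
\]
% with \rho^{(n)} the n-site density matrix emitted by the source and
% QC a suitable quantum algorithmic complexity per qubit.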
