Fundamental Limitations in Sequential Prediction and Recursive Algorithms: Lp Bounds via an Entropic Analysis

In this paper, we obtain fundamental ${\mathcal{L}_p}$ bounds in sequential prediction and recursive algorithms via an entropic analysis. Both classes of problems are examined by investigating the underlying entropic relationships of the data and/or noises involved, and the derived lower bounds can all be characterized in terms of conditional entropy. We also study the conditions under which these generic bounds are achieved, from an innovations viewpoint.
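The kind of entropic ${\mathcal{L}_p}$ bound discussed here typically rests on the classical maximum-entropy inequality under an ${\mathcal{L}_p}$-norm constraint: among all densities with a given $p$-th absolute moment, the generalized Gaussian maximizes differential entropy, which yields $\mathbb{E}[|e|^p]^{1/p} \ge e^{h(e)} / \big(2\,\Gamma(1+1/p)\,(p e)^{1/p}\big)$ for a random variable $e$ with differential entropy $h(e)$ in nats. As a hedged illustration (the function name and the specific numerical check below are our own, not from the paper), the following sketch evaluates this bound and confirms that for $p=2$ it is tight for a Gaussian:

```python
import math

def lp_entropy_lower_bound(h, p):
    """Lower bound on the Lp norm E[|e|^p]^(1/p) of a random variable
    with differential entropy h (in nats), obtained from the
    maximum-entropy (generalized-Gaussian) distribution under an
    Lp-norm constraint."""
    return math.exp(h) / (2.0 * math.gamma(1.0 + 1.0 / p)
                          * (p * math.e) ** (1.0 / p))

# Sanity check for p = 2: a zero-mean Gaussian with standard deviation
# sigma has h = 0.5 * ln(2*pi*e*sigma^2), and the bound is tight,
# recovering sigma itself.
sigma = 1.7
h = 0.5 * math.log(2.0 * math.pi * math.e * sigma ** 2)
print(lp_entropy_lower_bound(h, p=2.0))  # ~1.7

# For p = 1 the maximizer is the Laplace density with E|X| = b and
# h = 1 + ln(2b); the bound again recovers b exactly.
b = 0.4
h1 = 1.0 + math.log(2.0 * b)
print(lp_entropy_lower_bound(h1, p=1.0))  # ~0.4
```

Equality holds precisely when the error is generalized-Gaussian of order $p$; for any other error distribution with the same entropy, the realized ${\mathcal{L}_p}$ norm strictly exceeds the bound.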
