Universal Scanning and Sequential Decision Making for Multidimensional Data

We investigate several problems in the scanning of multidimensional data arrays, including universal scanning and prediction ("scandiction," for short) and the scandiction of noisy data arrays. These problems arise in several areas of image and video processing, such as predictive coding, filtering, and denoising. In predictive coding of images, for example, an image is compressed by coding the prediction-error sequence resulting from scandicting it. It is therefore natural to ask what the optimal method to scan and predict a given image is, what the resulting minimum prediction loss is, and whether there exist specific scandiction schemes which are universal in some sense. More specifically, we investigate the following problems. First, given a random field, we examine whether there exists a scandiction scheme which is independent of the field's distribution, yet asymptotically achieves the same performance as if this distribution were known. This question is answered in the affirmative for the set of all spatially stationary random fields, under mild conditions on the loss function. We then discuss the scenario where a non-optimal scanning order is used, yet accompanied by an optimal predictor, and derive a bound on the excess loss compared to optimal scandiction. Finally, we examine the scenario where the random field is corrupted by noise, but the scanning and prediction (or filtering) scheme is judged with respect to the underlying noiseless field.
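To make the scandiction setting concrete, the sketch below shows a minimal (and entirely illustrative) version of the predictive-coding step described above: the image is scanned in a fixed raster order, each pixel is predicted from the previously scanned ones, and the prediction-error sequence is collected. The `last_value_predictor` used here is a hypothetical stand-in for the far richer predictors and scanning orders studied in the paper.

```python
import numpy as np

def raster_scandict(image, predictor):
    """Scan a 2-D array in raster order, predicting each entry from the
    previously scanned ones, and return the prediction-error sequence."""
    h, w = image.shape
    errors = []
    past = []  # ((i, j), value) pairs, in scan order
    for i in range(h):
        for j in range(w):
            pred = predictor(past)               # predict from scanned history
            errors.append(image[i, j] - pred)    # record the prediction error
            past.append(((i, j), image[i, j]))   # reveal the true value
    return np.array(errors)

# Hypothetical toy predictor: guess the most recently scanned value
# (0 before anything is scanned); actual schemes use context models.
def last_value_predictor(past):
    return past[-1][1] if past else 0.0

img = np.array([[1.0, 2.0],
                [3.0, 4.0]])
errs = raster_scandict(img, last_value_predictor)
# errs → [1., 1., 1., 1.]
```

In a predictive coder, this error sequence (rather than the raw pixels) would be entropy coded; the paper's questions concern which scan order and predictor minimize the cumulative loss of such a sequence, and whether a single scheme can do so universally over a class of fields.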
