Information and distortion in reduced-order filter design

The relation of information theory to the filtering problem, and its practical relevance there, have long been open questions. This paper considers the design, evaluation, and comparison of (suboptimal) reduced-order filters by information-theoretic methods. First, the differences and similarities between the information theory problem and the filtering problem are delineated. Then, based on these considerations, a formulation that realistically embeds the reduced-order filter problem in an information-theoretic framework is presented. This formulation includes a "constrained" version of the rate-distortion function. The Shannon lower bound is used both to derive formulas for (achievable) mean-square-error lower bounds for suboptimal filters and to prove that, for the reduced-order filter problem, the given formulation specifies a useful relation between information and distortion in filtering. Theorems addressed to reduced-order filter design, evaluation, and comparison based on information are given. Finally, a two-step design procedure is outlined that decouples the search in filter parameter space and hence yields computational savings.
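As an illustration of the Shannon-lower-bound machinery the abstract invokes, the following is a minimal sketch, not the paper's own derivation: for a source with differential entropy h and mean-square-error distortion, the Shannon lower bound states D >= (1/(2*pi*e)) * exp(2h - 2R), where R is the mutual information (in nats) that the filter conveys about the source. For a scalar Gaussian source the bound is tight, giving the familiar D = sigma^2 * 2^(-2R) with R in bits. Function names here are illustrative, not from the paper.

```python
import math

def gaussian_diff_entropy(variance):
    """Differential entropy (nats) of a zero-mean Gaussian: h = 0.5*ln(2*pi*e*var)."""
    return 0.5 * math.log(2.0 * math.pi * math.e * variance)

def shannon_lower_bound_mse(diff_entropy_nats, rate_nats):
    """Shannon lower bound on MSE distortion: D >= exp(2h - 2R) / (2*pi*e)."""
    return math.exp(2.0 * (diff_entropy_nats - rate_nats)) / (2.0 * math.pi * math.e)

# Example: unit-variance Gaussian source. No filter extracting R bits/sample
# of information about the source can achieve MSE below sigma^2 * 2**(-2R).
sigma2 = 1.0
h = gaussian_diff_entropy(sigma2)
for rate_bits in (0.5, 1.0, 2.0):
    rate_nats = rate_bits * math.log(2.0)
    d_min = shannon_lower_bound_mse(h, rate_nats)
    print(f"R = {rate_bits} bits  ->  MSE >= {d_min:.4f}")
```

At R = 1 bit the bound gives D >= 0.25 for a unit-variance Gaussian, matching the Gaussian rate-distortion function; for non-Gaussian sources the same expression remains a valid (generally loose) lower bound, which is what makes it usable for evaluating suboptimal reduced-order filters.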
