Performance analysis of the mutual information function for nonlinear and linear signal processing
Nonlinear signal processing is now well established both in theory and applications. Nevertheless, very few tools are available for the analysis of nonlinear systems. We introduce the mutual information function (MIF) as a nonlinear correlation function and describe the practicalities of estimating it from data. Even if an estimator is consistent, it is of great interest to characterize its bias and variance for a finite sample. We discuss these questions, as well as computational efficiency, for two estimators. Both algorithms are of complexity N log₂ N, where N is the sample length, but they use different methods to construct the histogram underlying the mutual information estimate. An efficient implementation makes it possible to apply the algorithms to real-time signal processing problems where linear correlation analysis breaks down. Examples involving linear and nonlinear channels are discussed.
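The histogram-based estimation described above can be illustrated with a minimal sketch. The fixed-bin NumPy implementation below is an assumption for illustration only; it is not either of the paper's two N log₂ N estimators, which use more sophisticated histogram constructions:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in estimate of I(X;Y) in bits from a fixed-bin 2-D histogram."""
    # Joint histogram -> empirical joint probability mass function
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    # Marginals by summing out the other variable
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    # I(X;Y) = sum_{x,y} p(x,y) log2[ p(x,y) / (p(x) p(y)) ], over nonzero cells
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px[:, None] * py[None, :])[nz])))

def mif(x, max_lag, bins=16):
    """Mutual information function: I between x[n] and x[n+k] for lags k = 1..max_lag,
    the nonlinear analogue of the autocorrelation function."""
    n = len(x)
    return np.array([mutual_information(x[: n - k], x[k:], bins)
                     for k in range(1, max_lag + 1)])
```

Like the linear autocorrelation function, the MIF can be scanned over lags to detect dependence, but it also captures nonlinear dependence; note that this naive plug-in estimator carries a finite-sample bias that grows with the number of bins, which is exactly the kind of effect the paper analyzes.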