Estimation of the joint probability of multisensory signals

Abstract: This paper presents a novel method for estimating the joint probability of multisensory signals by introducing dimension-reduction mapping functions based on the principle of maximum entropy. A maximum mutual information criterion is derived for selecting the desired mapping functions, and an algorithm is presented for the case of linear transformations of Gaussian random vectors. Experimental results demonstrate the performance of the proposed method.
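For jointly Gaussian vectors, the maximum mutual information criterion mentioned in the abstract has a well-known closed form: the mutual information of two Gaussian blocks is half the log-ratio of the marginal covariance determinants to the joint determinant, and the linear maps maximizing it are given by canonical correlation analysis (the top pair achieves I = -0.5·log(1 - ρ²), where ρ is the leading canonical correlation). The sketch below illustrates this standard Gaussian result; it is an assumption-driven illustration, not the paper's own algorithm, and the function names are invented for this example.

```python
import numpy as np

def gaussian_mi(c_xx, c_yy, c_xy):
    """Mutual information (in nats) between jointly Gaussian vectors,
    given the covariance blocks C_xx, C_yy and cross-covariance C_xy:
        I(x; y) = 0.5 * log( det(C_xx) * det(C_yy) / det(C_joint) )."""
    joint = np.block([[c_xx, c_xy], [c_xy.T, c_yy]])
    return 0.5 * (np.linalg.slogdet(c_xx)[1]
                  + np.linalg.slogdet(c_yy)[1]
                  - np.linalg.slogdet(joint)[1])

def max_mi_projections(c_xx, c_yy, c_xy):
    """One-dimensional linear maps a, b maximizing I(a^T x; b^T y) in the
    Gaussian case: whiten each block, then take the leading singular
    vectors of the whitened cross-covariance (canonical correlation
    analysis). Returns (a, b, top canonical correlation)."""
    wx = np.linalg.inv(np.linalg.cholesky(c_xx))  # whitener for x
    wy = np.linalg.inv(np.linalg.cholesky(c_yy))  # whitener for y
    u, s, vt = np.linalg.svd(wx @ c_xy @ wy.T)
    a = wx.T @ u[:, 0]
    b = wy.T @ vt[0, :]
    return a, b, s[0]
```

As a usage check: with identity marginal covariances and a diagonal cross-covariance diag(0.8, 0.3), the optimal projections pick out the first coordinate pair, and the mutual information of the projected scalars is -0.5·log(1 - 0.8²) ≈ 0.51 nats.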
