Associating Data in System of Systems Using Measures of Information

Many modern collaborative systems and systems of systems rely heavily on sharing information in order to improve performance, manage resources, and maximize overall capability. Such systems are characterized by many decentralized nodes, which may all be identical or may be partitioned into a finite set of specialized types. When information is shared within a system of systems, overall performance hinges on the ability to correctly associate the information received. The primary hypothesis evaluated on receipt of new information is whether that information belongs to a previously observed entity. When this hypothesis is false, the new information belongs to a new entity, which may be either a real or a false entity. To evaluate this hypothesis, and to determine the optimal assignments to make at each time step, a data association discriminator or scoring function that behaves like a distance function between two probability distributions...
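As a minimal sketch of the idea described above, the following example uses the Kullback-Leibler divergence between one-dimensional Gaussian estimates as the association score: a new estimate is assigned to the closest existing entity if the score falls below a threshold, and otherwise declared a new entity. The function names, the Gaussian model, and the threshold rule are illustrative assumptions, not the paper's specific discriminator.

```python
import math

def kl_gaussian(mu0, var0, mu1, var1):
    # KL divergence KL(N(mu0, var0) || N(mu1, var1)) between 1-D Gaussians.
    return 0.5 * (var0 / var1 + (mu1 - mu0) ** 2 / var1 - 1.0
                  + math.log(var1 / var0))

def associate(new_est, tracks, threshold):
    """Illustrative association step (not the paper's exact method):
    assign the new (mean, variance) estimate to the existing track with
    the smallest divergence, provided that divergence is at most
    `threshold`; otherwise return None to signal a new entity."""
    best_id, best_score = None, float("inf")
    for track_id, (mu, var) in tracks.items():
        score = kl_gaussian(new_est[0], new_est[1], mu, var)
        if score < best_score:
            best_id, best_score = track_id, score
    return best_id if best_score <= threshold else None
```

For example, with two tracks at means 0 and 5 (unit variance), a new estimate with mean 0.1 associates to the first track, while a new estimate with mean 10 exceeds the threshold against both and is declared a new entity. Any other divergence in the paper's family of information measures could be swapped in for `kl_gaussian` without changing the decision structure.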
