Sensor Selection in High-Dimensional Gaussian Trees with Nuisances

We consider the sensor selection problem on multivariate Gaussian distributions in which only a subset of the latent variables is of inferential interest. For pairs of vertices connected by a unique path in the graph, we show that there exist decompositions of nonlocal mutual information into local information measures that can be computed efficiently from the output of message passing algorithms. We integrate these decompositions into a computationally efficient greedy selector in which the cost of information quantification can be distributed across nodes in the network. Experimental results demonstrate the comparative efficiency of our algorithms for sensor selection in high-dimensional distributions. We additionally derive an online-computable performance bound based on augmenting the set of relevant latent variables; whenever such a valid augmentation exists, the bound applies to any distribution with nuisances.
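To illustrate the greedy information-maximization structure described above, the following Python sketch selects measurements one at a time to maximize mutual information with a set of relevant latent variables in a jointly Gaussian model. This is a minimal sketch under simplifying assumptions: it works on a dense covariance matrix with generic log-determinant mutual information rather than the efficient message-passing decompositions developed here, so it does not scale to high-dimensional trees; all function and variable names are illustrative.

```python
import numpy as np

def gaussian_mi(cov, rel, meas):
    """Mutual information I(x_rel; z_meas) for a jointly Gaussian vector
    with covariance matrix `cov`; `rel` and `meas` are index lists."""
    if not meas:
        return 0.0
    S_r = cov[np.ix_(rel, rel)]
    S_m = cov[np.ix_(meas, meas)]
    S_joint = cov[np.ix_(rel + meas, rel + meas)]
    # I(x; z) = 0.5 * (log|S_x| + log|S_z| - log|S_xz|) for Gaussians.
    return 0.5 * (np.linalg.slogdet(S_r)[1]
                  + np.linalg.slogdet(S_m)[1]
                  - np.linalg.slogdet(S_joint)[1])

def greedy_select(cov, rel, candidates, budget):
    """Greedily choose `budget` measurement indices from `candidates`
    maximizing mutual information with the relevant set `rel`."""
    selected, remaining = [], list(candidates)
    for _ in range(min(budget, len(remaining))):
        best = max(remaining,
                   key=lambda j: gaussian_mi(cov, rel, selected + [j]))
        selected.append(best)
        remaining.remove(best)
    return selected
```

In a tree-structured model, the per-candidate evaluations inside the greedy loop are exactly the quantities that the local decompositions and message passing are intended to compute cheaply and in a distributed fashion, replacing the dense log-determinant computations used in this sketch.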
