Scalable robust hypothesis tests using graphical models

Traditional binary hypothesis testing relies on precise knowledge of the probability density of the observed random vector conditioned on each hypothesis. In many applications, however, these densities can only be approximated, because of limited training data or dynamic changes affecting the observed signal. A classical approach to handling such imprecise knowledge is minimax robust hypothesis testing (RHT), in which a test is designed to minimize the worst-case performance over all models in a neighborhood of the nominal, imprecisely known density. Despite its promise for robust classification problems, RHT has seen rather limited application because, in its native form, it does not scale gracefully with the dimension of the observed random vector. In this paper, we use approximations via probabilistic graphical models, in particular block-tree graphs, to enable computationally tractable algorithms for realizing RHT on high-dimensional data. We quantify the resulting reductions in computational complexity. Experimental results on simulated data and a target recognition problem show minimal performance loss relative to the true RHT.
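To make the minimax RHT idea concrete, the classical Huber-style robust test replaces the plain likelihood ratio with a censored (clipped) one, so that no single sample from a contaminated model can dominate the decision. The sketch below is illustrative only: the function name, the Gaussian nominal densities, and the clipping limits `c_lo`, `c_hi` are assumptions for the example, not the paper's construction, and the block-tree machinery the paper actually develops is not shown.

```python
import numpy as np

def clipped_lr_test(x, f0, f1, c_lo, c_hi, tau):
    """Huber-style robust likelihood-ratio test (illustrative sketch).

    Censors each per-sample likelihood ratio to [c_lo, c_hi] before
    aggregating, which bounds the influence of outliers produced by
    models in the vicinity of the nominal densities f0 and f1.
    Returns 1 to decide H1, 0 to decide H0.
    """
    lr = f1(x) / f0(x)              # per-sample likelihood ratio under nominals
    lr = np.clip(lr, c_lo, c_hi)    # censoring: limits any one sample's influence
    return int(np.sum(np.log(lr)) > tau)  # compare clipped log-LR sum to threshold

# Example with hypothetical Gaussian nominals: H0 ~ N(0,1), H1 ~ N(1,1)
f0 = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
f1 = lambda x: np.exp(-(x - 1)**2 / 2) / np.sqrt(2 * np.pi)
decision = clipped_lr_test(np.full(10, 1.0), f0, f1, c_lo=0.2, c_hi=5.0, tau=0.0)
```

The clipping limits trade nominal performance for robustness: tighter limits protect harder against contamination but blunt the test when the nominal model is in fact correct. The paper's contribution is orthogonal to this scalar picture, namely making such robust tests tractable when `x` is high-dimensional, by approximating the densities with block-tree graphical models.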
