Robust linear dimensionality reduction for hypothesis testing with application to sensor selection

This paper addresses robust linear dimensionality reduction (RLDR) for binary Gaussian hypothesis testing. The goal is to find a linear map from the high-dimensional space where the data vector lives to a low-dimensional space where the hypothesis test is carried out. The linear map is designed to maximize detector performance, which translates into maximizing the Kullback-Leibler (KL) distance between the two projected distributions. In practice, the distribution parameters are estimated from training data and are therefore subject to uncertainty; we model this by allowing the parameters to drift within confidence regions. We address the case where only the mean values of the Gaussian distributions, m0 and m1, are uncertain, with confidence ellipsoids defined by the corresponding covariance matrices, S0 and S1. Under this setup, we find the linear map that maximizes the KL distance under the worst-case drift of the mean values. For mappings to one dimension, we solve the problem globally by reducing it to a grid search over a finite interval. Our solution outperforms robust linear discriminant analysis techniques recently proposed in the literature. In addition, we use our RLDR solution as a building block to derive a sensor selection algorithm for robust event detection in sensor networks. The sensor selection algorithm is quasi-optimal: the worst-case KL distance of the subset it selects is at most 15% smaller than that of the optimal subset found by exhaustive search.
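To make the setup concrete, below is a minimal Python sketch of the worst-case criterion for a one-dimensional projection. It assumes the mean uncertainty sets are ellipsoids {m : (m - m_i)^T S_i^{-1} (m - m_i) <= r^2} with an illustrative radius r, uses the closed-form (directed) KL distance between two scalar Gaussians, and replaces the paper's exact grid search over a finite scalar interval with a naive search over random unit directions; the function names and the parameter r are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def projected_worst_case_kl(w, m0, m1, S0, S1, r):
    """Worst-case KL distance between the two projected 1-D Gaussians when
    each mean may drift within {m : (m - m_i)^T S_i^{-1} (m - m_i) <= r^2}."""
    w = w / np.linalg.norm(w)
    s0 = np.sqrt(w @ S0 @ w)  # projected standard deviation under H0
    s1 = np.sqrt(w @ S1 @ w)  # projected standard deviation under H1
    # The projected mean w^T m_i can drift by at most r * s_i, so the
    # worst-case drift shrinks the mean separation as much as possible.
    sep = max(abs(w @ (m1 - m0)) - r * (s0 + s1), 0.0)
    # Closed-form KL distance between N(mu0, s0^2) and N(mu1, s1^2).
    return np.log(s1 / s0) + (s0**2 + sep**2) / (2 * s1**2) - 0.5

def search_direction(m0, m1, S0, S1, r, n_candidates=1000, seed=0):
    """Naive stand-in for the paper's grid search: evaluate the worst-case
    KL distance over random unit directions and keep the best one."""
    rng = np.random.default_rng(seed)
    best_w, best_kl = None, -np.inf
    for _ in range(n_candidates):
        w = rng.standard_normal(len(m0))
        kl = projected_worst_case_kl(w, m0, m1, S0, S1, r)
        if kl > best_kl:
            best_kl, best_w = kl, w / np.linalg.norm(w)
    return best_w, best_kl
```

The brute-force search above only illustrates the objective being maximized; the contribution of the paper is showing that, for mappings to one dimension, the global optimum of this worst-case problem can be found exactly by a grid search over a single scalar parameter on a finite interval.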
