Detecting Random Walks on Graphs With Heterogeneous Sensors

We consider the problem of detecting a random walk on a graph from observations made at the graph nodes. When visited by the walk, each node observes a signal of elevated mean, which may differ across nodes. Outside the path of the walk, and also in its absence, nodes measure only noise. In the Neyman-Pearson setting, our goal is to characterize detection performance by computing the error exponent of the miss probability under a constraint on the probability of false alarm. Since exact computation of the error exponent is known to be difficult, being equivalent to computing a Lyapunov exponent, we approximate its value by a tractable lower bound. The bound reveals an interesting detectability condition: the walk is detectable whenever the entropy of the walk is smaller than one half of the expected signal-to-noise ratio. We derive the bound by extending the notion of Markov types to Gauss-Markov types: sequences of state-observation pairs that share the same node-to-node transition counts and the same per-node average signal values, the latter computed from the measurements taken while the random walk visited each node's location. The lower bound has an intuitive interpretation: among all Gauss-Markov types that are asymptotically feasible in the absence of the walk, it finds the one that is most typical in the presence of the walk. Finally, we show by a sequence of judicious problem reformulations that computing the bound reduces to solving a convex optimization problem, a result of interest in its own right.
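As an illustrative formalization only (the notation below is ours and is not fixed by the abstract), suppose the walk has transition matrix P with stationary distribution pi, node i observes mean mu_i when visited, and the observation noise has variance sigma^2. One natural reading of the stated detectability condition, entropy of the walk below one half of the expected signal-to-noise ratio, is then

% Sketch under assumed notation: H(P) is the entropy rate of the walk and the
% right-hand side is half the stationary-averaged per-node SNR.
\[
  H(P) \;=\; -\sum_{i} \pi_i \sum_{j} P_{ij} \log P_{ij}
  \;<\;
  \frac{1}{2} \sum_{i} \pi_i \, \frac{\mu_i^{2}}{\sigma^{2}} .
\]

The left-hand side grows as the walk becomes harder to predict, while the right-hand side grows with per-node signal strength, so the condition trades off the randomness of the walk against the quality of the sensors.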
