Motivated by sensor network applications, we study a decentralized network of J sensors in which each sensor observes all or some components of an underlying sparse signal ensemble. Sensors operate without collaborating with each other or with the fusion center; each sensor transmits a subset of its linear measurements to the fusion center, which gathers the data from all sensors and reconstructs the signal. The goal is to compress the data at each node efficiently so that accurate reconstruction at the fusion center is possible, which requires that sufficient, well-chosen measurements reach the fusion center. In a decentralized network, each sensor measures only part of the sparse signal; we refer to the sparsity of the signal observed at a node as its local sparsity. Although the original signal is sparse, there is no guarantee of local sparsity at any individual node. To manage decentralized reconstruction, we propose a new Bernoulli sampling scheme: an independent Bernoulli trial with success probability p is associated with each measurement a sensor can make, and the sensor takes the measurement only if the outcome of the trial is 1; otherwise the measurement is skipped. We apply this sampling scheme to several sparsity models for the observed signal, including a common signal model, a common signal with innovations model, and a partitioned signal model. We show that accurate reconstruction of the signal from Bernoulli-sampled measurements is possible, with no assumption on local sparsity, provided the success probability of the Bernoulli sampling exceeds a lower bound. We also show that recovery through Bernoulli sampling is robust to measurement noise and packet loss.
For a signal of length N with sparsity k, the lower bound we derive on the parameter p of the Bernoulli sampling, for robust and accurate reconstruction, is O((k/N) log(N/k)). This implies that the expected number of measurements needed for stable and accurate reconstruction is O(k log(N/k)), which matches the result obtained for a collaborating sensor network or for a distributed network under a local sparsity assumption.
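The Bernoulli sampling scheme described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: `Phi` stands for a sensor's candidate measurement matrix, and each of its rows (one candidate linear measurement) is kept only when an independent Bernoulli(p) trial succeeds.

```python
import numpy as np

def bernoulli_sample_measurements(Phi, x, p, rng=None):
    """Sketch of the Bernoulli sampling scheme.

    Each of the M candidate measurements <phi_i, x> (rows of Phi) is
    taken only if an independent Bernoulli trial with success
    probability p yields 1; otherwise that measurement is skipped.
    Returns the measurements actually taken and the keep mask.
    """
    rng = np.random.default_rng() if rng is None else rng
    keep = rng.random(Phi.shape[0]) < p   # independent Bernoulli(p) trials
    y = Phi[keep] @ x                     # measurements the sensor transmits
    return y, keep

# The expected number of retained measurements is p * M. With M on the
# order of N and p = O((k/N) log(N/k)), the expected measurement count
# is O(k log(N/k)), matching the bound stated above.
```

A dropped trial here also models a lost packet, which is why reconstruction guarantees that hold for any realization of the Bernoulli mask extend naturally to packet loss.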