Distributed sensor network data fusion using image processing

In this paper we discuss the analogy between the analysis of spatially distributed sensor networks and image processing. The analogy arises because, in high-density sensor networks, sensor outputs are correlated both spatially and temporally: the output of a sensor is correlated with the outputs of its neighbours. This is very similar to pixel intensities in video signals. A video signal consists of multiple correlated frames (temporal correlation), and each frame consists of a large number of pixels with typically high correlation between neighbouring pixels (spatial correlation). Once this relation is established, well-known image processing techniques can be applied to sensor data compression, fusion, and analysis. As an example, we show how quadtree image decomposition can be used for the spatial decomposition of a sensor field.
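The quadtree idea can be sketched as follows: treat the sensor field as a 2D grid of readings and recursively split each square block into four quadrants until the readings within a block are nearly uniform, so that spatially correlated regions collapse into a single representative value. This is a minimal illustrative sketch, not the paper's implementation; the splitting criterion (value spread against a threshold) and the leaf representation `(x, y, size, mean)` are assumptions made here for clarity.

```python
# Sketch of quadtree decomposition applied to a grid of sensor readings.
# Correlated (near-uniform) regions become single coarse leaves; regions
# with high spatial variation are split down to individual sensors.

def quadtree(grid, x, y, size, threshold):
    """Recursively split the square block with top-left corner (x, y)
    and side `size` until the value spread within a block falls below
    `threshold`. Returns a list of leaves (x, y, size, mean)."""
    block = [grid[r][c]
             for r in range(y, y + size)
             for c in range(x, x + size)]
    spread = max(block) - min(block)
    if spread <= threshold or size == 1:
        # Block is homogeneous enough: represent it by its mean reading.
        return [(x, y, size, sum(block) / len(block))]
    half = size // 2
    leaves = []
    for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
        leaves += quadtree(grid, x + dx, y + dy, half, threshold)
    return leaves


# Example: a 4x4 field whose left half is uniform (high spatial
# correlation) and whose right half varies from sensor to sensor.
grid = [
    [20, 20, 25, 30],
    [20, 20, 26, 31],
    [20, 20, 27, 32],
    [20, 20, 28, 33],
]
leaves = quadtree(grid, 0, 0, 4, threshold=1.0)
# 10 leaves: the uniform left half collapses into two 2x2 blocks,
# while the varying right half decomposes into 8 single sensors.
```

The same decomposition that compresses flat image regions into large blocks thus groups correlated sensors into clusters, reducing the amount of data that must be fused or transmitted.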
