Distributed EM algorithms for density estimation and clustering in sensor networks

The paper considers the problem of density estimation and clustering in distributed sensor networks. Each node in the network is assumed to sense an environment that can be described as a mixture of some elementary conditions, so the measurements are modeled statistically as a mixture of Gaussians, with each Gaussian component corresponding to one of the elementary conditions. The paper presents a distributed expectation-maximization (EM) algorithm that estimates the Gaussian components, which are common to the environment and the sensor network as a whole, together with the mixing probabilities, which may vary from node to node. The algorithm produces a Gaussian mixture approximation of the density of the sensor data without requiring the data to be transmitted to, and processed at, a central location. Equivalently, the algorithm can be viewed as a distributed processing strategy for clustering the sensor data into components corresponding to the predominant environmental features sensed by the network. The convergence of the distributed EM algorithm is analyzed, and simulations demonstrate the potential of this approach to sensor network data analysis.
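To make the parameter-sharing structure concrete, the following is a minimal Python/NumPy sketch of this kind of scheme: Gaussian means and covariances are shared by all nodes, while each node keeps its own mixing probabilities. It is not the paper's algorithm or its in-network message-passing protocol; the function names `distributed_em` and `gaussian_pdf`, the per-node data layout, and the centrally simulated aggregation loop are assumptions introduced only for illustration.

```python
import numpy as np

def gaussian_pdf(X, mean, cov):
    """Multivariate Gaussian density evaluated at each row of X."""
    d = X.shape[1]
    diff = X - mean
    inv = np.linalg.inv(cov)
    expo = -0.5 * np.einsum('ij,jk,ik->i', diff, inv, diff)
    norm = np.sqrt(((2 * np.pi) ** d) * np.linalg.det(cov))
    return np.exp(expo) / norm

def distributed_em(node_data, K, n_iters=50, seed=0):
    """Sketch of EM for a Gaussian mixture over several nodes:
    component means/covariances are shared network-wide, while each node
    keeps its own mixing probabilities.  The cross-node aggregation of
    sufficient statistics is simulated here by a simple summation loop;
    in an actual network it would be done by in-network communication."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate(node_data)
    d = pooled.shape[1]
    # Initialize shared means/covariances from the pooled data.
    mu = pooled[rng.choice(len(pooled), K, replace=False)]
    cov = np.array([np.cov(pooled.T) + 1e-6 * np.eye(d) for _ in range(K)])
    # Node-specific mixing probabilities, initialized uniformly.
    pi = [np.full(K, 1.0 / K) for _ in node_data]

    for _ in range(n_iters):
        # Accumulators for the globally shared sufficient statistics.
        Nk = np.zeros(K)
        sum_x = np.zeros((K, d))
        sum_xx = np.zeros((K, d, d))
        for m, X in enumerate(node_data):
            # E-step (local): responsibilities from local pi, shared mu/cov.
            resp = np.stack([pi[m][k] * gaussian_pdf(X, mu[k], cov[k])
                             for k in range(K)], axis=1)
            resp = np.maximum(resp, 1e-300)
            resp /= resp.sum(axis=1, keepdims=True)
            # Local M-step: this node's mixing probabilities.
            pi[m] = resp.mean(axis=0)
            # Local sufficient statistics, to be aggregated network-wide.
            Nk += resp.sum(axis=0)
            sum_x += resp.T @ X
            for k in range(K):
                sum_xx[k] += (resp[:, k, None] * X).T @ X
        # Global M-step: shared means and covariances from aggregated stats.
        mu = sum_x / Nk[:, None]
        for k in range(K):
            cov[k] = sum_xx[k] / Nk[k] - np.outer(mu[k], mu[k]) + 1e-6 * np.eye(d)
    return mu, cov, pi
```

Calling `distributed_em([X1, X2], K=3)` on per-node data arrays `X1`, `X2` returns the shared component parameters and one mixing-weight vector per node. The split mirrors the structure described in the abstract: only the sufficient statistics for the shared Gaussian parameters need to circulate through the network, while the node-specific mixing probabilities are updated purely locally.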
