Distributed Asynchronous Averaging for Community Detection

Consider the following probabilistic process on a graph G = (V,E). Each vertex v is initially labeled by a real number, say a value chosen uniformly at random in {-1,+1}; then, repeatedly, a random edge is picked and the values of its endpoints are replaced by their average. Suppose the process is run on a graph exhibiting a community structure, such as two expanders joined by a sparse cut: is there a phase of the process in which its state reflects the underlying community structure? Moreover, can nodes (at least approximately) learn that structure via a simple, local procedure? These questions arise because the expected action of the one-edge-at-a-time averaging corresponds to the repeated application of the transition matrix of a lazy random walk on G, and it is known that, for certain graph classes, the resulting evolution of the state makes it possible to uncover the underlying community structure. We answer the first question in the affirmative for a class of regular clustered graphs that includes the regular stochastic block model. Addressing this question (even within this restricted class) requires studying the concentration of the averaging process around its expectation, which in turn calls for a deeper understanding of how products of certain random matrices concentrate around their expectation. These properties (albeit in different flavors) emerge both in the regime in which the sparsity of the cut is o(1/log |V|) (with constant expansion within each community) and in the regime in which the sparsity is constant. The analysis in the latter regime is the technically hardest part of this work, since we have to establish concentration results up to inverse-polynomial errors. As for the second question, since nodes do not share a common clock, translating the above results into distributed clustering protocols is not immediate. To this end, we show that concentration holds over a long time window and that most nodes are able to select a local time within this window. This yields the first asynchronous distributed algorithms that require logarithmic or polylogarithmic work per node (depending on the sparsity of the cut) and approximately recover the community structure.
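
To make the connection to the lazy walk explicit (a standard computation, spelled out here for context): when edge {u,v} is selected, the state vector x is multiplied by W_{uv} = I - (1/2)(e_u - e_v)(e_u - e_v)^T, so the expected one-step matrix over a uniformly random edge is E[W] = I - L/(2|E|), where L is the Laplacian of G; for a d-regular graph on n vertices this equals (1 - 1/n) I + (1/n) A/d, i.e. the transition matrix of a (very) lazy random walk on G. The following Python sketch simulates the averaging process on a planted two-community graph. It is only an illustration under simplified assumptions: the graph model (independent edges rather than the regular clustered graphs studied in the paper), the number of steps, and the median-based labeling rule are choices made for this example, not the paper's protocol, and the names planted_two_community_graph, p_in and p_out are hypothetical.

import random

def planted_two_community_graph(n, p_in, p_out, rng):
    # Two communities of about n/2 nodes each; an edge appears with
    # probability p_in inside a community and p_out across the cut.
    # (Illustrative model; the paper works with regular clustered graphs.)
    half = n // 2
    community = [0] * half + [1] * (n - half)
    edges = []
    for u in range(n):
        for v in range(u + 1, n):
            p = p_in if community[u] == community[v] else p_out
            if rng.random() < p:
                edges.append((u, v))
    return edges, community

def averaging_process(n, edges, steps, rng):
    # One-edge-at-a-time averaging: start from random +/-1 values, then
    # repeatedly pick a uniformly random edge and replace both endpoint
    # values by their average.
    x = [rng.choice([-1.0, 1.0]) for _ in range(n)]
    for _ in range(steps):
        u, v = rng.choice(edges)
        x[u] = x[v] = (x[u] + x[v]) / 2.0
    return x

if __name__ == "__main__":
    rng = random.Random(0)
    n = 200
    edges, community = planted_two_community_graph(n, p_in=0.2, p_out=0.01, rng=rng)
    x = averaging_process(n, edges, steps=20 * n, rng=rng)
    # Simplified labeling rule (an assumption, not the paper's protocol):
    # split the nodes at the median of the current values.
    med = sorted(x)[n // 2]
    labels = [int(xi > med) for xi in x]
    agree = sum(int(l == c) for l, c in zip(labels, community))
    print("agreement with planted partition:", max(agree, n - agree), "/", n)

The intent of the sketch is to illustrate the phase alluded to above: with a sparse cut (p_out much smaller than p_in), values within each community contract toward a common level well before global consensus, so a simple threshold on the current values correlates with the planted partition. The asynchronous protocols of the paper additionally handle the absence of a common clock, which this sketch ignores.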
