Abstract

Distributed average consensus refers to computing the average of inputs held by multiple agents that communicate with each other over a peer-to-peer network. Cooperation among agents is imperative for any distributed average consensus protocol, since each agent has to share its input with other agents, usually its adjacent (neighboring) agents. Privacy concerns, however, may discourage some agents from participating in such protocols. This paper proposes a novel distributed privacy mechanism that preserves the privacy of the collection of honest agents' inputs as long as the colluding semi-honest agents do not form a vertex cut of the network. The proposed mechanism does not alter the average of the agents' inputs; consequently, it does not protect whatever information is already revealed by the average itself. It imposes minimal additional computation and communication costs, requires no alteration of the underlying distributed consensus protocol, and offers a highly scalable, practical solution for privacy in distributed average consensus. The privacy achieved is quantified using the Kullback-Leibler (KL) divergence, and its limitations are analyzed for two cases: (i) inputs that are continuous random variables, and (ii) inputs that are discrete random variables.
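The abstract does not detail the mechanism itself, but its key property, namely that the perturbation leaves the network average unchanged, can be illustrated with a minimal sketch. The Python sketch below assumes one common instantiation of average-preserving perturbation (not necessarily the paper's construction): each pair of neighboring agents exchanges a random zero-sum offset before running a standard linear average-consensus iteration with Metropolis weights. The ring topology, the noise distribution, and all variable names are illustrative assumptions.

# Hypothetical sketch of average-preserving perturbation for private consensus.
# Assumption: pairwise zero-sum offsets between neighbors, then a standard
# linear average-consensus iteration; this is NOT taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Undirected ring of 5 agents; edges listed as (i, j) pairs.
n = 5
edges = [(i, (i + 1) % n) for i in range(n)]

x = rng.normal(size=n)          # private inputs
x_tilde = x.copy()              # perturbed inputs

# For each edge (i, j), draw one offset delta; agent i adds +delta and
# agent j adds -delta, so the offsets cancel in the network sum.
for (i, j) in edges:
    delta = rng.normal(scale=10.0)
    x_tilde[i] += delta
    x_tilde[j] -= delta

assert np.isclose(x.mean(), x_tilde.mean())   # average is preserved

# Standard average consensus on the perturbed values (Metropolis weights).
deg = np.zeros(n)
for (i, j) in edges:
    deg[i] += 1
    deg[j] += 1
W = np.zeros((n, n))
for (i, j) in edges:
    W[i, j] = W[j, i] = 1.0 / (1.0 + max(deg[i], deg[j]))
np.fill_diagonal(W, 1.0 - W.sum(axis=1))      # rows (and columns) sum to 1

z = x_tilde.copy()
for _ in range(200):
    z = W @ z                                  # each agent averages with neighbors

print("true average:      ", x.mean())
print("consensus estimate:", z[0])

Under this assumed scheme the offsets cancel in the sum, so consensus still converges to the true average, while a larger offset variance makes any individual perturbed value less informative about the corresponding input; the abstract quantifies this kind of effect via KL-divergence.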