Private matchings and allocations

We consider a private variant of the classical allocation problem: given k goods and n agents with individual, private valuation functions over bundles of goods, how can we partition the goods among the agents to maximize social welfare? An important special case is when each agent desires at most one good and specifies her (private) value for each good: in this case, the problem is exactly the maximum-weight matching problem in a bipartite graph. Private matching and allocation problems have not been considered in the differential privacy literature, and for good reason: they are plainly impossible to solve under the standard notion of differential privacy. Informally, the allocation must match agents to their preferred goods in order to maximize social welfare, but these preferences are exactly what the agents wish to hide! We therefore consider the problem under the relaxed constraint of joint differential privacy: for any agent i, no coalition of agents excluding i should be able to learn about the valuation function of agent i. In this setting, the full allocation is no longer published; instead, each agent is told only which good she receives. We first show that, given a small number of identical copies of each good, it is possible to solve the maximum-weight matching problem efficiently and accurately while guaranteeing joint differential privacy. We then consider the more general allocation problem, where agent valuations satisfy the gross substitutes condition. Finally, we prove that the allocation problem cannot be solved to non-trivial accuracy under joint differential privacy unless multiple copies of each type of good are available.
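
To make the guarantee precise, the following is a standard formalization of joint differential privacy, consistent with the informal statement above; the notation (a mechanism $M$ mapping a profile of private valuations to one output per agent) is assumed for illustration rather than quoted from the paper.

```latex
% Joint differential privacy: the outputs given to everyone
% *except* agent i must be insensitive to agent i's valuation.
% M(v)_{-i} denotes the outputs of all agents other than i.
A mechanism $M : \mathcal{V}^n \to \mathcal{O}^n$ is
$(\varepsilon,\delta)$-jointly differentially private if for every
agent $i$, every pair of profiles $v, v' \in \mathcal{V}^n$ that
differ only in agent $i$'s valuation, and every event
$S \subseteq \mathcal{O}^{n-1}$,
\[
  \Pr\bigl[\, M(v)_{-i} \in S \,\bigr]
  \;\le\; e^{\varepsilon} \cdot \Pr\bigl[\, M(v')_{-i} \in S \,\bigr] + \delta .
\]
```

Note the asymmetry with standard differential privacy: agent $i$'s own output may depend arbitrarily on her own valuation, which is what makes accurate allocation possible at all.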
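
The gross substitutes condition referenced above is the classical one of Kelso and Crawford, stated here via the demand correspondence for completeness:

```latex
% Demand correspondence at prices p:
%   D(v, p) = argmax_{S \subseteq [k]} ( v(S) - \sum_{j \in S} p_j ).
A valuation $v$ satisfies \emph{gross substitutes} if for all price
vectors $p \le p'$ (coordinatewise) and every $S \in D(v, p)$, there
exists $S' \in D(v, p')$ with
\[
  \{\, j \in S : p_j = p'_j \,\} \subseteq S' .
\]
```

Informally: raising the prices of some goods never causes an agent to drop a demanded good whose price did not change.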

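For intuition, here is a minimal sketch of the non-private baseline in the unit-demand special case: a maximum-weight bipartite matching computed with SciPy's assignment solver. The function name and the example values are hypothetical, and this is not the paper's mechanism; it only pins down the objective that the private algorithm must approximate.

```python
# Non-private baseline for the unit-demand case: maximum-weight
# bipartite matching between n agents and k goods. Publishing this
# matching is NOT differentially private, since each chosen edge
# directly reveals an agent's preferences.
import numpy as np
from scipy.optimize import linear_sum_assignment

def max_weight_matching(values: np.ndarray) -> list[tuple[int, int]]:
    """values[i, j] = agent i's (private) value for good j."""
    agents, goods = linear_sum_assignment(values, maximize=True)
    return list(zip(agents.tolist(), goods.tolist()))

# Hypothetical example: 3 agents, 4 goods.
values = np.array([
    [3.0, 1.0, 0.0, 2.0],
    [2.0, 4.0, 1.0, 0.0],
    [0.0, 1.0, 5.0, 1.0],
])
print(max_weight_matching(values))  # -> [(0, 0), (1, 1), (2, 2)]
```

Under joint differential privacy the matching as a whole is never released: each agent learns only her own assignment, and the coordination needed to reach a near-optimal outcome must happen through statistics that are themselves computed privately.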