Differentially Private Federated Learning in Multi-Cell Networks

Federated learning (FL) is a distributed learning paradigm in which multiple users train local models and upload the models or gradients to an edge server for artificial intelligence (AI) model training. However, uploading local models can leak information about the underlying local data. Differential privacy (DP) is a randomized mechanism that adds calibrated uncertainty to protect the privacy of a dataset. In this paper, we study a multi-cell FL network in which each cell operates as an FL system. Each user adds artificial noise to its uploaded local gradient, and the multi-cell interference is exploited to enhance the DP levels. The problem is formulated as a mean square error (MSE) minimization subject to DP and power constraints, solved by controlling each user's transmission power for both the local gradient and the artificial noise. Our results show that multi-cell interference is beneficial to DP.
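The per-user perturbation step described above can be sketched as follows. This is a minimal illustration assuming a standard Gaussian DP mechanism with gradient clipping; the function name and parameter values are illustrative and not taken from the paper, and it omits the over-the-air interference aspect that the paper exploits.

```python
import numpy as np

def perturb_gradient(grad, clip_norm=1.0, noise_std=0.5, rng=None):
    """Clip a local gradient and add Gaussian artificial noise before upload.

    clip_norm bounds the gradient's L2 sensitivity; noise_std sets the
    artificial-noise scale. Larger noise gives a stronger DP guarantee
    but raises the MSE of the aggregated model update.
    """
    rng = np.random.default_rng() if rng is None else rng
    norm = np.linalg.norm(grad)
    # Scale the gradient down only if its norm exceeds the clipping bound.
    clipped = grad * min(1.0, clip_norm / norm) if norm > 0 else grad
    # Artificial noise added by the user prior to transmission.
    return clipped + rng.normal(0.0, noise_std, size=grad.shape)

# Example: a user's local gradient with L2 norm 5 is clipped to norm 1,
# then perturbed before being uploaded to the edge server.
g = np.array([3.0, 4.0])
noisy = perturb_gradient(g, clip_norm=1.0, noise_std=0.1)
```

The MSE-vs-privacy trade-off studied in the paper corresponds to jointly tuning the transmit power of the gradient signal and `noise_std` under the power budget.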