Distributed Deep Learning under Differential Privacy with the Teacher-Student Paradigm

The goal of this work in progress is to address distributed deep learning under differential privacy using the teacher-student paradigm. In this setting, a number of distributed entities interact with a single aggregator. Each entity uses deep learning to train a teacher network on its own sensitive, labeled training data. The knowledge of the teacher networks is then transferred to a student network at the aggregator in a privacy-preserving manner that protects the sensitive data: the transfer is performed by training the student on non-sensitive, unlabeled data. We also apply secure multi-party computation to securely combine the outputs of the local models when updating the global model.
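As a rough illustration of how such a privacy-preserving knowledge transfer can work, the sketch below follows a PATE-style noisy teacher vote: each teacher predicts a label for an unlabeled example, and the aggregator releases only the argmax of the Laplace-noised vote histogram. The function name, the per-query budget epsilon, and the noise scale 1/epsilon are assumptions for this sketch, not the exact mechanism of this work.

```python
import numpy as np

def noisy_aggregate(teacher_preds, num_classes, epsilon, seed=None):
    """Label one unlabeled example via a Laplace-noised teacher vote.

    teacher_preds: array of predicted class indices, one per teacher.
    epsilon: per-query privacy budget; the noise scale 1/epsilon is an
    illustrative choice, not necessarily the paper's calibration.
    """
    rng = np.random.default_rng(seed)
    # Count how many teachers voted for each class.
    votes = np.bincount(teacher_preds, minlength=num_classes).astype(float)
    # Add independent Laplace noise to each count before taking the argmax,
    # so the released label reveals little about any single teacher's data.
    noisy_votes = votes + rng.laplace(scale=1.0 / epsilon, size=num_classes)
    return int(np.argmax(noisy_votes))

# The student at the aggregator is then trained on (unlabeled example,
# noisy label) pairs, so it never touches the sensitive training data.
```

In a full system, each such query would consume part of the privacy budget, and the secure multi-party computation step would compute the vote histogram without any party seeing the individual teachers' outputs in the clear.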