EaSTFLy: Efficient and secure ternary federated learning

Abstract Privacy-preserving machine learning allows multiple parties to perform distributed data analytics while guaranteeing individual privacy. In this area, researchers have proposed many schemes that combine machine learning with privacy-preserving technologies, but these works fall short in terms of efficiency. Meanwhile, federated learning has received widespread attention because it updates model parameters without collecting users' raw data, yet it still suffers from high communication overhead and privacy risks. Recently, ternary gradient federated learning (TernGrad) was proposed to reduce communication, but it remains vulnerable to various security and privacy threats. In this paper, we first analyze the privacy leakages of TernGrad. We then present our solution, EaSTFLy, to address these issues. More concretely, EaSTFLy combines TernGrad with secret sharing and homomorphic encryption to build privacy-preserving protocols secure against semi-honest adversaries. In addition, we optimize our protocols via SIMD. Compared with prior works on floating-point gradients, our protocols incur lower communication and computation overheads, and the accuracy matches that of plaintext ternary federated learning. To the best of our knowledge, this is the first work to combine ternary federated learning with privacy-preserving technologies. Finally, we report experiments demonstrating these improvements.
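For readers unfamiliar with the building blocks named in the abstract, the sketch below illustrates, under assumptions rather than as the paper's exact protocol, how TernGrad-style ternarization can be combined with additive secret sharing: each party quantizes its gradient to a scale times values in {-1, 0, +1}, then splits the ternary vector into random additive shares over a ring so that an aggregator only ever handles shares. The function names (`ternarize`, `share_additively`, `reconstruct`) and the ring size `MODULUS` are illustrative; EaSTFLy's actual protocols additionally use homomorphic encryption and SIMD packing, which this toy example omits.

```python
import numpy as np

MODULUS = 2 ** 32          # illustrative ring size for additive secret sharing
rng = np.random.default_rng()

def ternarize(grad: np.ndarray) -> tuple[float, np.ndarray]:
    """TernGrad-style stochastic ternarization: return a scale s and a
    vector t with entries in {-1, 0, +1} such that E[s * t] = grad."""
    s = np.max(np.abs(grad))
    if s == 0:
        return 0.0, np.zeros_like(grad, dtype=np.int64)
    # keep coordinate i with probability |g_i| / s, then apply its sign
    keep = rng.random(grad.shape) < (np.abs(grad) / s)
    return float(s), (np.sign(grad) * keep).astype(np.int64)

def share_additively(t: np.ndarray, n_shares: int = 2) -> list[np.ndarray]:
    """Split a ternary vector (lifted into Z_MODULUS) into n additive shares."""
    lifted = np.mod(t, MODULUS).astype(np.uint64)
    shares = [rng.integers(0, MODULUS, size=t.shape, dtype=np.uint64)
              for _ in range(n_shares - 1)]
    last = np.mod(lifted - sum(shares), MODULUS).astype(np.uint64)
    return shares + [last]

def reconstruct(shares: list[np.ndarray]) -> np.ndarray:
    """Recombine shares modulo MODULUS and map back to signed values."""
    total = np.mod(sum(s.astype(np.uint64) for s in shares), MODULUS)
    signed = total.astype(np.int64)
    signed[signed > MODULUS // 2] -= MODULUS
    return signed

# toy usage: one party's gradient, ternarized and secret-shared
grad = np.array([0.7, -0.2, 0.0, 1.3])
scale, tern = ternarize(grad)
shares = share_additively(tern)
assert np.array_equal(reconstruct(shares), tern)
```

Because ternarized gradients are small integers, they are cheap to share, aggregate, and (in the full protocols) pack into homomorphic-encryption plaintext slots, which is where the communication and computation savings over floating-point gradients come from.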

[1] Yao Lu et al. Oblivious Neural Network Predictions via MiniONN Transformations. IACR Cryptol. ePrint Arch., 2017.

[2] Vitaly Shmatikov et al. Privacy-Preserving Deep Learning. 53rd Annual Allerton Conference on Communication, Control, and Computing (Allerton), 2015.

[3] Aseem Rastogi et al. EzPC: Programmable and Efficient Secure Two-Party Computation for Machine Learning. IEEE European Symposium on Security and Privacy (EuroS&P), 2019.

[4] Dan Boneh et al. Prio: Private, Robust, and Scalable Computation of Aggregate Statistics. NSDI, 2017.

[5] Cong Xu et al. TernGrad: Ternary Gradients to Reduce Communication in Distributed Deep Learning. NIPS, 2017.

[6] T. Giordano et al. The Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy Rule: Implications for Clinical Research. Annual Review of Medicine, 2006.

[7] Ananda Theertha Suresh et al. Distributed Mean Estimation with Limited Communication. ICML, 2016.

[8] Payman Mohassel et al. SecureML: A System for Scalable Privacy-Preserving Machine Learning. IEEE Symposium on Security and Privacy (SP), 2017.

[9] Adi Shamir et al. How to Share a Secret. CACM, 1979.

[10] Shun-ichi Amari et al. Backpropagation and Stochastic Gradient Descent Method. Neurocomputing, 1993.

[11] Dong Yu et al. 1-bit Stochastic Gradient Descent and Its Application to Data-Parallel Distributed Training of Speech DNNs. INTERSPEECH, 2014.

[12] Alexander J. Smola et al. Scaling Distributed Machine Learning with the Parameter Server. OSDI, 2014.

[13] Yehuda Lindell et al. Introduction to Modern Cryptography. 2004.

[14] Peter Rindal et al. ABY3: A Mixed Protocol Framework for Machine Learning. IACR Cryptol. ePrint Arch., 2018.

[15] IT Governance Privacy Team. EU General Data Protection Regulation (GDPR): An Implementation and Compliance Guide, Second Edition. 2017.

[16] Shiho Moriai et al. Privacy-Preserving Deep Learning via Additively Homomorphic Encryption. IEEE Transactions on Information Forensics and Security, 2018.

[17] Pascal Paillier et al. Public-Key Cryptosystems Based on Composite Degree Residuosity Classes. EUROCRYPT, 1999.

[18] Anantha Chandrakasan et al. Gazelle: A Low Latency Framework for Secure Neural Network Inference. IACR Cryptol. ePrint Arch., 2018.

[19] Michael Naehrig et al. CryptoNets: Applying Neural Networks to Encrypted Data with High Throughput and Accuracy. ICML, 2016.

[20] Dan Alistarh et al. QSGD: Communication-Optimal Stochastic Gradient Descent, with Applications to Training Neural Networks. arXiv:1610.02132, 2016.

[21] Sameer Wagh et al. SecureNN: 3-Party Secure Computation for Neural Network Training. Proc. Priv. Enhancing Technol., 2019.

[22] Sarvar Patel et al. Practical Secure Aggregation for Privacy-Preserving Machine Learning. IACR Cryptol. ePrint Arch., 2017.

[23] Pritish Narayanan et al. Deep Learning with Limited Numerical Precision. ICML, 2015.