GELU-Net: A Globally Encrypted, Locally Unencrypted Deep Neural Network for Privacy-Preserved Learning

Privacy is a fundamental challenge for a variety of smart applications that depend on data aggregation and collaborative learning across different entities. In this paper, we propose a novel privacy-preserved architecture in which clients can collaboratively train a deep model while preserving the privacy of each client's data. Our main strategy is to carefully partition a deep neural network between two non-colluding parties: one party performs linear computations on encrypted data using a less complex homomorphic cryptosystem, while the other executes non-polynomial computations in plaintext but in a privacy-preserved manner. We analyze security and compare the communication and computation complexity with existing approaches. Extensive experiments on different datasets demonstrate not only stable training without accuracy loss, but also a 14 to 35 times speedup over the state-of-the-art system.
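
To make the partition concrete, here is a minimal sketch of the general idea under simplifying assumptions: Paillier encryption via the python-paillier (`phe`) library stands in for the cryptosystem, the two parties are ordinary functions rather than networked entities, and names such as `linear_party` and `activation_party` are hypothetical, not the paper's terminology. It shows one linear layer evaluated homomorphically on ciphertexts, followed by a plaintext ReLU at the second party; it is an illustration of the split, not the paper's exact protocol.

```python
# Sketch only: assumes the python-paillier (`phe`) library is installed.
import numpy as np
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# The data owner encrypts its input vector before sharing it.
x = np.array([0.5, -1.2, 3.0])
enc_x = [public_key.encrypt(float(v)) for v in x]

def linear_party(enc_inputs, weights, bias):
    """Party A: evaluates a linear layer on ciphertexts.
    Paillier is additively homomorphic, so ciphertext + ciphertext and
    ciphertext * plaintext scalar suffice for a dot product plus bias."""
    outputs = []
    for w_row, b in zip(weights, bias):
        acc = public_key.encrypt(float(b))
        for enc_v, w in zip(enc_inputs, w_row):
            acc = acc + enc_v * float(w)
        outputs.append(acc)
    return outputs

def activation_party(enc_pre_activations):
    """Party B: decrypts the pre-activations and applies the
    non-polynomial activation (here ReLU) in plaintext; in a full
    pipeline it would re-encrypt the result for the next layer."""
    z = np.array([private_key.decrypt(c) for c in enc_pre_activations])
    return np.maximum(z, 0.0)

W = np.array([[0.2, -0.4, 0.1],
              [1.0, 0.3, -0.7]])
b = np.array([0.05, -0.1])
print(activation_party(linear_party(enc_x, W, b)))
```

The design choice this sketch mirrors is that all heavy cryptographic work stays on the linear, polynomial part of the network, where an additively homomorphic scheme is sufficient, while the activations that would otherwise require expensive polynomial approximation are handled in plaintext by the non-colluding second party.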
