Privacy-Preserving Machine Learning

The popularization of cloud computing and machine learning algorithms has enabled a range of complex analytic services, such as medical or financial assessments. A computationally limited client can pay for such a service and obtain prediction or classification results. These services often involve sensitive data that should be kept private. Ideally, the service should be privacy-preserving: the client learns the model's result from the service provider without revealing its input (or the result) to the provider, while the provider's trained model remains confidential from the client with as little leakage as possible. This keynote focuses on how cryptography can enable privacy-preserving machine learning services, in particular, decision tree evaluation.
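To make the setting concrete, the sketch below shows the plaintext functionality that private decision-tree evaluation protocols compute obliviously: the server holds the tree (feature indices and thresholds), the client holds the feature vector, and only the label at the reached leaf should be learned. This is an illustrative example, not material from the keynote; the names `Node`, `Leaf`, and `evaluate` are hypothetical.

```python
# Plaintext sketch of decision-tree evaluation. In a privacy-preserving
# protocol, each comparison along the root-to-leaf path would be performed
# on encrypted or secret-shared values so that neither party learns the
# other's inputs (names and structure here are illustrative).

from dataclasses import dataclass
from typing import Union


@dataclass
class Leaf:
    label: int                      # classification result stored at the leaf


@dataclass
class Node:
    feature: int                    # index into the client's feature vector
    threshold: int                  # server's secret comparison threshold
    left: Union["Node", Leaf]       # taken when x[feature] < threshold
    right: Union["Node", Leaf]      # taken when x[feature] >= threshold


def evaluate(tree: Union[Node, Leaf], x: list) -> int:
    """Walk one root-to-leaf path; each step is a single comparison."""
    node = tree
    while isinstance(node, Node):
        node = node.left if x[node.feature] < node.threshold else node.right
    return node.label


# Toy usage: a depth-2 tree over two features.
tree = Node(feature=0, threshold=5,
            left=Leaf(label=0),
            right=Node(feature=1, threshold=3,
                       left=Leaf(label=1),
                       right=Leaf(label=2)))
print(evaluate(tree, [7, 2]))       # goes right then left, prints 1
```

The protocols discussed in the keynote realize this same functionality, but replace each plaintext comparison with a cryptographic comparison (e.g., over homomorphically encrypted values) and hide which path is taken through the tree.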
