On Formally Bounding Information Leakage by Statistical Estimation

We study the problem of giving formal bounds on the information leakage of deterministic programs when only black-box access to the system is provided and little is known about the input generation mechanism. After introducing a statistical set-up and defining a formal notion of information leakage estimator, we prove that, in the absence of significant a priori information about the output distribution, no such estimator exists that does significantly better than exhaustive enumeration of the input domain. Moreover, we show that the difficult part is essentially obtaining tight upper bounds. This motivates us to consider a relaxed scenario in which the analyst is given some control over the input distribution: an estimator is introduced that, with high probability, gives lower bounds irrespective of the underlying distribution, and tight upper bounds if the input distribution induces a "close to uniform" output distribution. We then define two methods, one based on Metropolis Monte Carlo and one based on Accept-Reject, that can ideally be employed to sample from such an input distribution, and discuss a practical methodology based on them. We finally demonstrate the proposed methodology with a few experiments, including an analysis of cache side-channels in sorting algorithms.
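To illustrate the kind of distribution-independent lower bound discussed above, the following is a minimal sketch (not the paper's actual estimator): for a deterministic program, leakage under min-entropy measures is log2 of the number of distinct outputs, and counting distinct outputs observed under black-box sampling can only undercount, so it yields a valid lower bound with high probability whatever the input distribution. The function names and the toy program below are illustrative assumptions, not taken from the paper.

```python
import math
import random

def leakage_lower_bound(program, sample_input, n_samples):
    """Return a lower bound (in bits) on the leakage of a deterministic
    black-box `program`: log2 of the number of distinct outputs observed
    over `n_samples` inputs drawn via `sample_input`.  Unseen outputs can
    only make the true leakage larger, so this never overestimates."""
    seen = set()
    for _ in range(n_samples):
        seen.add(program(sample_input()))
    return math.log2(len(seen))

# Toy program (illustrative): leaks the low 4 bits of a 16-bit secret,
# so its true min-entropy leakage is exactly 4 bits.
prog = lambda x: x & 0xF
est = leakage_lower_bound(prog, lambda: random.randrange(1 << 16), 5000)
print(est)  # approaches 4.0 as sampling covers all 16 distinct outputs
```

Note that the converse problem (the upper bound) is exactly where this naive approach fails: with a skewed input distribution, rarely produced outputs may never be sampled, which is why the paper's relaxed scenario steers the input distribution toward one inducing a close-to-uniform output distribution.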
