A Tunable Measure for Information Leakage

A tunable measure for information leakage, called maximal $\alpha$-leakage, is introduced. This measure quantifies the maximal gain of an adversary in refining a tilted version of its prior belief about any (potentially random) function of a dataset, given a disclosed version of that dataset. The choice of $\alpha$ determines the adversarial action, ranging from refining a belief for $\alpha=1$ to guessing the most likely value for $\alpha=\infty$; for these extremal values the measure simplifies to mutual information (MI) and maximal leakage (MaxL), respectively. For all other $\alpha$, the measure is shown to equal the Arimoto channel capacity of order $\alpha$. Several properties of this measure are proven, including: (i) quasi-convexity in the mapping between the original and disclosed datasets; (ii) data processing inequalities; and (iii) a composition property. A full version of this paper is in [1].
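
The interpolation described above can be illustrated numerically. The sketch below (not from the paper) uses the standard formulas for Shannon MI, Arimoto mutual information of order $\alpha$ (Rényi entropy minus Arimoto conditional entropy), and maximal leakage $\log \sum_y \max_x P(y|x)$, evaluated on a small hypothetical joint distribution `pxy`. Taking $\alpha$ near 1 approximately recovers MI, and MaxL upper-bounds the Arimoto MI of any fixed prior for large $\alpha$; note that maximal $\alpha$-leakage itself additionally requires a supremum over priors (the Arimoto channel capacity), which is omitted here.

```python
import numpy as np

def shannon_mi(pxy):
    """Shannon mutual information I(X;Y) in nats; pxy[x, y] is a joint pmf."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal P(x), column vector
    py = pxy.sum(axis=0, keepdims=True)   # marginal P(y), row vector
    prod = px @ py                        # product distribution P(x)P(y)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / prod[mask])))

def arimoto_mi(pxy, alpha):
    """Arimoto mutual information of order alpha != 1, in nats:
    I_alpha^A(X;Y) = H_alpha(X) - H_alpha^A(X|Y), where H_alpha is the
    Renyi entropy and H_alpha^A the Arimoto conditional entropy."""
    px = pxy.sum(axis=1)
    h_renyi = np.log(np.sum(px ** alpha)) / (1.0 - alpha)
    h_cond = (alpha / (1.0 - alpha)) * np.log(
        np.sum(np.sum(pxy ** alpha, axis=0) ** (1.0 / alpha)))
    return float(h_renyi - h_cond)

def maximal_leakage(pxy):
    """Maximal leakage log sum_y max_x P(y|x), in nats."""
    px = pxy.sum(axis=1, keepdims=True)
    return float(np.log((pxy / px).max(axis=0).sum()))

# Hypothetical joint distribution for illustration only.
pxy = np.array([[0.3, 0.1],
                [0.1, 0.5]])

print(shannon_mi(pxy))          # Shannon MI
print(arimoto_mi(pxy, 1.0001))  # alpha near 1: close to Shannon MI
print(arimoto_mi(pxy, 100.0))   # large alpha: bounded above by MaxL
print(maximal_leakage(pxy))     # maximal leakage
```

This only evaluates the fixed-prior quantities; maximizing `arimoto_mi` over the input prior for each $\alpha$ would give the Arimoto channel capacity, i.e., maximal $\alpha$-leakage.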

[1] Sudeep Kamath, et al. An operational measure of information leakage, 2016 Annual Conference on Information Science and Systems (CISS), 2016.

[2] V. Tan, et al. Hypothesis testing under maximal leakage privacy constraints, 2017 IEEE International Symposium on Information Theory (ISIT), 2017.

[3] Flávio du Pin Calmon, et al. Privacy against statistical inference, 2012 50th Annual Allerton Conference on Communication, Control, and Computing (Allerton), 2012.

[4] Muriel Médard, et al. On information-theoretic metrics for symmetric-key encryption and privacy, 2014 52nd Annual Allerton Conference on Communication, Control, and Computing (Allerton), 2014.

[5] Thomas M. Cover, et al. Elements of Information Theory (2nd ed.), 2006.

[6] R. Sibson. Information radius, 1969.

[7] Claude E. Shannon, et al. Communication theory of secrecy systems, Bell Syst. Tech. J., 1949.

[8] Neri Merhav, et al. Universal Prediction, IEEE Trans. Inf. Theory, 1998.

[9] Sergio Verdú, et al. Convexity/concavity of Rényi entropy and α-mutual information, 2015 IEEE International Symposium on Information Theory (ISIT), 2015.

[10] H. Vincent Poor, et al. Utility-Privacy Tradeoffs in Databases: An Information-Theoretic Approach, IEEE Transactions on Information Forensics and Security, 2011.

[11] Peter E. Latham, et al. Mutual Information, 2006.

[12] G. Crooks. On Measures of Entropy and Information, 2015.

[13] Richard D. Wesel, et al. Multiterminal source coding with an entropy-based distortion measure, 2011 IEEE International Symposium on Information Theory Proceedings, 2011.