Privacy-aware guessing efficiency

We investigate the problem of guessing a discrete random variable $Y$ under a privacy constraint dictated by another correlated discrete random variable $X$, where both guessing efficiency and privacy are assessed in terms of the probability of correct guessing. We define $h(P_{XY}, \varepsilon)$ as the maximum probability of correctly guessing $Y$ given an auxiliary random variable $Z$, where the maximization is taken over all $P_{Z|Y}$ ensuring that the probability of correctly guessing $X$ given $Z$ does not exceed $\varepsilon$. We show that the map $\varepsilon \mapsto h(P_{XY}, \varepsilon)$ is strictly increasing, concave, and piecewise linear, which allows us to derive a closed-form expression for $h(P_{XY}, \varepsilon)$ when $X$ and $Y$ are connected via a binary-input binary-output channel. For $\{(X_i, Y_i)\}_{i=1}^{n}$ being pairs of independent and identically distributed binary random variables, we similarly define $\underline{h}_n(P_{X^n Y^n}, \varepsilon)$ under the assumption that $Z^n$ is also a binary vector. We then obtain a closed-form expression for $\underline{h}_n(P_{X^n Y^n}, \varepsilon)$ for sufficiently large, but nontrivial, values of $\varepsilon$.
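For concreteness, the quantity described above can be written out as follows. This is a minimal formalization in our own notation, writing $\mathsf{P}_{\mathsf{c}}(\cdot \mid Z)$ for the probability of correct guessing under the maximum a posteriori rule; it is a sketch of the stated definition, not a verbatim formula from the paper:
\[
h(P_{XY}, \varepsilon) \;=\; \sup_{\substack{P_{Z|Y}\,:\;\mathsf{P}_{\mathsf{c}}(X \mid Z)\,\le\,\varepsilon}} \mathsf{P}_{\mathsf{c}}(Y \mid Z),
\qquad
\mathsf{P}_{\mathsf{c}}(Y \mid Z) \;=\; \sum_{z} \max_{y} P_{YZ}(y, z).
\]
Here $X \to Y \to Z$ forms a Markov chain, since $Z$ is generated by passing $Y$ through the privacy mechanism $P_{Z|Y}$.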
