On a Relationship between the Correct Probability of Estimation from Correlated Data and Mutual Information

Let $X$ and $Y$ be two correlated discrete random variables. We consider the estimation of $X$ from encoded data $\varphi(Y)$, where $\varphi$ is an encoder function applied to $Y$. We derive an inequality relating the correct probability of estimation to the mutual information between $X$ and $\varphi(Y)$. This inequality may be useful for the security analysis of cryptographic systems when the success probability of estimating secret data is used as a security criterion. It also gives an intuitive meaning to the secrecy exponent under the strong secrecy criterion.
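For context, the classical Fano inequality gives one well-known link of this kind (this is the standard textbook bound, not the new inequality derived in the paper): for any estimate $\hat{X} = g(\varphi(Y))$ with error probability $P_e = \Pr[\hat{X} \neq X]$ and correct probability $P_c = 1 - P_e$,

```latex
% Classical Fano bound (context only; not the paper's derived inequality).
\begin{align}
  H(X \mid \varphi(Y)) &\le h_b(P_e) + P_e \log\bigl(|\mathcal{X}| - 1\bigr), \\
  I(X; \varphi(Y)) &= H(X) - H(X \mid \varphi(Y)) \\
                   &\ge H(X) - h_b(P_e) - P_e \log\bigl(|\mathcal{X}| - 1\bigr),
\end{align}
% where h_b(p) = -p log p - (1-p) log(1-p) is the binary entropy
% function and |X| is the alphabet size of X.
```

Thus a small mutual information $I(X; \varphi(Y))$ forces the error probability $P_e$ to be large whenever $H(X)$ is large, which is the intuition behind using the success probability of estimation as a security criterion.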
