On Conditional α-Information and its Application to Side-Channel Analysis

A conditional version of Sibson’s $\alpha$-information is defined via a simple closed-form “log-expectation” expression that satisfies important properties such as consistency, uniform expansion, and data processing inequalities. This definition is compared to previous proposals, which in contrast do not satisfy all of these properties. Based on our definition and on a generalized Fano inequality, we extend the $\alpha=1$ results of previous works to obtain sharp universal upper bounds on the probability of success of any type of side-channel attack, particularly for $\alpha=2$.
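For background, the (unconditional) Sibson $\alpha$-information that the abstract builds on is commonly written as follows for discrete $X$, $Y$; this is the standard definition, not the paper's new conditional version, which additionally takes an expectation over the conditioning variable (exact form in the paper):

```latex
% Sibson's alpha-information for alpha in (0,1) U (1, infinity);
% standard background definition, not the paper's conditional extension.
I_\alpha(X;Y) \;=\; \frac{\alpha}{\alpha-1}
  \log \sum_{y} \Bigl( \sum_{x} P_X(x)\, P_{Y\mid X}(y\mid x)^{\alpha} \Bigr)^{1/\alpha},
\qquad
\lim_{\alpha \to 1} I_\alpha(X;Y) \;=\; I(X;Y).
```

The limit $\alpha \to 1$ recovers Shannon's mutual information, which is why the bounds here generalize the $\alpha=1$ case of previous works.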
