Measuring Independence between Statistical Randomness Tests by Mutual Information

The analysis of independence between statistical randomness tests has received considerable attention in the literature recently. Detecting dependencies between statistical randomness tests makes it possible to identify tests that measure similar characteristics, and thus to minimize the number of tests that need to be applied. In this work, a method for detecting statistical dependency based on mutual information is proposed. The main advantage of using mutual information is its ability to detect nonlinear correlations, which cannot be captured by the linear correlation coefficient used in previous work. The method is applied to the correlations between the tests of the National Institute of Standards and Technology (NIST) battery, used as a standard in the evaluation of randomness. The experimental results show the existence of statistical dependencies between the tests that had not been previously detected.
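The contrast between the two measures can be illustrated with a small numerical experiment. The Python sketch below is an illustration only, not the implementation used in the paper; the equal-width binning (plug-in) estimator, the number of bins, and the quadratic toy relationship are all assumptions made for the example. It constructs a pair of variables whose Pearson correlation is close to zero even though one is a deterministic function of the other, while the mutual information estimate is clearly positive.

    import numpy as np

    def mutual_information(x, y, bins=10):
        """Plug-in estimate of mutual information (in bits) between two
        samples, obtained by discretizing them into equal-width bins."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = joint / joint.sum()                 # joint distribution
        px = pxy.sum(axis=1, keepdims=True)       # marginal of x
        py = pxy.sum(axis=0, keepdims=True)       # marginal of y
        nz = pxy > 0                              # avoid log(0)
        return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, 100_000)           # stand-in for one test's statistic
    y = x**2 + 0.01 * rng.normal(size=x.size)     # nonlinear dependence on x

    print(f"Pearson r: {np.corrcoef(x, y)[0, 1]:+.3f}")  # ~0: blind to the link
    print(f"MI (bits): {mutual_information(x, y):.3f}")  # clearly positive

With a quadratic relation and an input symmetric about zero, the covariance vanishes by construction, so the linear coefficient reports near-independence; the mutual information estimate instead reflects the strong underlying dependency. In practice, plug-in estimates of this kind are biased upward on small samples, so the bin count and sample size must be chosen with care.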
