On the Jensen-Shannon Divergence and Variational Distance

We compare two distance measures on probability distributions: a new metric induced by the Jensen-Shannon divergence, and the well-known L₁ (variational) metric. We show that several important results and constructions in computational complexity that hold under the L₁ metric carry over to the new metric, including Yao's next-bit predictor, the existence of extractors, the leftover hash lemma, and the expander-graph-based extractor construction. Finally, we show that the parity lemma, which is useful in the study of pseudorandomness, does not hold under the new metric.
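The two quantities compared above can be computed directly: the Jensen-Shannon divergence of p and q is the average KL divergence of each to their midpoint m = (p+q)/2, its square root is a metric, and the L₁ distance is the sum of coordinate-wise absolute differences. The sketch below, with illustrative function names not taken from the paper, shows both on finite distributions:

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence in bits; terms with p_i = 0 contribute 0.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_metric(p, q):
    # Square root of the Jensen-Shannon divergence, which is a metric.
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return math.sqrt((kl(p, m) + kl(q, m)) / 2)

def l1_distance(p, q):
    # Variational (L1) distance between two distributions.
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]
print(js_metric(p, q))    # sqrt(1/2) ≈ 0.7071
print(l1_distance(p, q))  # 1.0
```

Both measures are bounded (the JS metric by 1 in bits, the L₁ distance by 2) and vanish exactly when p = q, which is what makes a direct comparison of the two meaningful.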
