The entropies of the sum and the difference of two IID random variables are not too different

Consider the entropy increase h(Y + Y′) - h(Y) of the sum of two continuous i.i.d. random variables Y, Y′, and the corresponding entropy increase h(Y - Y′) - h(Y) of their difference. We show that the ratio of these two quantities always lies between 1/2 and 2. This complements a recent result of Lapidoth and Pete [4], which shows that the difference h(Y + Y′) - h(Y - Y′) can be arbitrarily large. Corresponding results are discussed for discrete entropy, and connections are drawn with recent mathematical work in additive combinatorics [5].
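The main result is the two-sided bound 1/2 ≤ (h(Y + Y′) - h(Y)) / (h(Y - Y′) - h(Y)) ≤ 2, which is easy to probe numerically. The Python snippet below is a minimal sanity check, not from the paper: it estimates the three differential entropies with a simple histogram plug-in estimator, using i.i.d. Exp(1) samples because that law is asymmetric, so Y + Y′ and Y - Y′ genuinely differ. The estimator, bin count, and sample size are illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)

    def diff_entropy(samples, bins=200):
        # Plug-in estimate of differential entropy h(X) = -integral of f log f,
        # in nats, from a density-normalized histogram.
        f, edges = np.histogram(samples, bins=bins, density=True)
        widths = np.diff(edges)
        mask = f > 0
        return -np.sum(f[mask] * np.log(f[mask]) * widths[mask])

    n = 1_000_000
    # Y, Y′ i.i.d. Exp(1): Y + Y′ is Gamma(2,1) and Y - Y′ is Laplace(0,1),
    # so the sum and the difference have genuinely different distributions.
    y, y2 = rng.exponential(size=n), rng.exponential(size=n)

    h_y = diff_entropy(y)          # exact value: 1 nat
    h_sum = diff_entropy(y + y2)   # exact value: 1 + Euler's gamma, about 1.577
    h_diff = diff_entropy(y - y2)  # exact value: 1 + log 2, about 1.693

    ratio = (h_sum - h_y) / (h_diff - h_y)
    print(f"ratio = {ratio:.3f}  (the theorem puts it in [1/2, 2])")

With the exact entropies above, the true ratio for this example is γ / log 2 ≈ 0.833, comfortably inside [1/2, 2], and the histogram estimate should land close to that value.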

[1] M. M. Madiman et al., "Entropy and set cardinality inequalities for partition-determined functions," Random Structures & Algorithms, 2008.

[2] S. Artstein et al., "Entropy methods."

[3] M. Madiman et al., "On the entropy of sums," in Proc. 2008 IEEE Information Theory Workshop, 2008.

[4] A. Lapidoth et al., "On the entropy of the sum and of the difference of independent random variables," in Proc. 2008 IEEE 25th Convention of Electrical and Electronics Engineers in Israel, 2008.

[5] T. Tao, "Sumset and inverse sumset theory for Shannon entropy," Combinatorics, Probability and Computing, 2009.