Shannon's entropy power inequality via restricted Minkowski sums

If $X$ is an $\mathbb{R}^n$-valued random variable whose distribution $P_X$ is absolutely continuous with respect to the Lebesgue measure $\lambda_n$ and $f$ is the corresponding density, the entropy of $X$ is defined via $h(X) := -\int f \log f \, d\lambda_n$. One of the fundamental results of Information Theory (see, e.g., [SW]) is Shannon's Entropy Power Inequality, which asserts that if $X, Y$ are two such variables which are independent, then