Maximizing the entropy of a sum of independent random variables

We show that the differential entropy of a sum of independent, symmetric random variables, each supported on [-1, 1], is maximized when one of them is uniformly distributed on [-1, 1] and the others are symmetric Bernoulli, taking the values ±1 with probability 1/2 each.
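The two-variable case already illustrates the claim: a uniform variable on [-1, 1] plus an independent ±1 variable sums to a uniform variable on [-2, 2], with differential entropy ln 4 nats, whereas two independent uniforms sum to a triangular variable with entropy 1/2 + ln 2 nats. The sketch below is not from the paper; the closed-form densities and the grid integration are an illustrative check of these two values, written here as an assumption-labeled example.

```python
# Numerical sanity check (illustration only, not the paper's argument):
# compare the differential entropy of uniform + Bernoulli ±1 against
# uniform + uniform, both sums supported on [-2, 2].
import numpy as np

def differential_entropy(density, lo, hi, n=200_001):
    """Estimate h(f) = -∫ f ln f dx in nats by the trapezoid rule on a grid."""
    x = np.linspace(lo, hi, n)
    f = density(x)
    integrand = np.where(f > 0, -f * np.log(np.clip(f, 1e-300, None)), 0.0)
    dx = x[1] - x[0]
    return float(np.sum((integrand[:-1] + integrand[1:]) * dx / 2))

# Uniform[-1, 1] + equiprobable ±1: the two shifted copies of the uniform
# density tile [-2, 2], so the sum is uniform with density 1/4 there.
unif_plus_bernoulli = lambda x: np.where(np.abs(x) <= 2, 0.25, 0.0)

# Uniform[-1, 1] + Uniform[-1, 1]: triangular density (2 - |x|)/4 on [-2, 2].
unif_plus_unif = lambda x: np.where(np.abs(x) <= 2, (2 - np.abs(x)) / 4, 0.0)

print("uniform + Bernoulli ±1:", differential_entropy(unif_plus_bernoulli, -2, 2))  # ≈ ln 4 ≈ 1.386
print("uniform + uniform     :", differential_entropy(unif_plus_unif, -2, 2))       # ≈ 1/2 + ln 2 ≈ 1.193
```

The comparison shows the stated maximizer beating one natural alternative in the n = 2 case; the paper's result is the general statement for any number of summands.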