Semicircularity, Gaussianity and Monotonicity of Entropy

The quantity $\|j(X(t))\|_2^2$ is called the Fisher information of $X(t)$ and is denoted by $F(X(t))$. Among all random variables with a given variance, the Gaussians are the unique ones with the smallest Fisher information and the largest entropy.
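To make this extremal statement quantitative, one can recall the standard classical bounds; the density $f$, variance $\sigma^2$ and differential entropy $h$ below are notation introduced here for illustration and are not taken from the text above. For a real random variable $X$ with smooth density $f$ and variance $\sigma^2 > 0$,
\[
F(X) \;=\; \|j(X)\|_2^2 \;=\; \int_{\mathbb{R}} \frac{f'(x)^2}{f(x)}\,dx \;\ge\; \frac{1}{\sigma^2},
\qquad
h(X) \;=\; -\int_{\mathbb{R}} f(x)\log f(x)\,dx \;\le\; \tfrac12\log\!\left(2\pi e\,\sigma^2\right),
\]
with equality in either inequality exactly when $X$ is Gaussian with variance $\sigma^2$. This is the precise sense in which Gaussians minimize Fisher information and maximize entropy at fixed variance.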