Semicircularity, Gaussianity and Monotonicity of Entropy
The quantity $\|j(X(t))\|_2^2$, where $j$ denotes the score function, is called the Fisher information of $X(t)$ and is denoted by $F(X(t))$. Among all random variables with a given variance, the Gaussians are the unique ones with the smallest Fisher information and the largest entropy.

∗ As a student of the PhD school OP-ALG-TOP-GEO, the author is partially supported by the Danish Research Training Council.
† Partially supported by the Danish National Research Foundation.
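For context, the extremal property asserted above can be stated precisely. The following display records the standard classical definitions and the two inequalities behind the claim; it is a summary of well-known facts, not a quotation from the paper. For a real random variable $X$ with sufficiently smooth density $f$ and variance $\sigma^2$, the score is $j(X) = f'(X)/f(X)$, and

$$
F(X) \;=\; \|j(X)\|_2^2 \;=\; \int_{\mathbb{R}} \frac{f'(x)^2}{f(x)}\,dx \;\ge\; \frac{1}{\sigma^2},
\qquad
h(X) \;=\; -\int_{\mathbb{R}} f(x)\log f(x)\,dx \;\le\; \tfrac{1}{2}\log\!\left(2\pi e\,\sigma^2\right),
$$

with equality in both exactly when $X$ is Gaussian (the first is the Cramér–Rao bound, the second the Gaussian maximum-entropy property).

A minimal numerical check of both inequalities, assuming SciPy is available; the Laplace distribution, matched here to variance 2, is my choice of comparison and does not appear in the paper:

```python
import numpy as np
from scipy import integrate

# Gaussian and Laplace densities matched to the same variance:
# Var = sigma^2 = 2 for the Gaussian, Var = 2*b^2 = 2 for the Laplace.
SIGMA2, B = 2.0, 1.0

def gauss_pdf(x):
    return np.exp(-x**2 / (2 * SIGMA2)) / np.sqrt(2 * np.pi * SIGMA2)

def gauss_dpdf(x):
    return -x / SIGMA2 * gauss_pdf(x)

def laplace_pdf(x):
    return np.exp(-abs(x) / B) / (2 * B)

def laplace_dpdf(x):
    return -np.sign(x) / B * laplace_pdf(x)

def fisher_info(pdf, dpdf, lim=20.0):
    # F(X) = ||j(X)||_2^2 = integral of f'(x)^2 / f(x)
    val, _ = integrate.quad(lambda x: dpdf(x)**2 / pdf(x), -lim, lim)
    return val

def entropy(pdf, lim=20.0):
    # h(X) = -integral of f(x) * log f(x)
    val, _ = integrate.quad(lambda x: -pdf(x) * np.log(pdf(x)), -lim, lim)
    return val

# Expected: Gaussian F = 1/2 = 1/Var (Cramer-Rao equality),
#           h = (1/2) log(4*pi*e) ~ 1.766 (maximum entropy).
print("Gaussian: F =", fisher_info(gauss_pdf, gauss_dpdf),
      " h =", entropy(gauss_pdf))
# Expected: Laplace F = 1 > 1/2, h = 1 + log 2 ~ 1.693 < 1.766.
print("Laplace:  F =", fisher_info(laplace_pdf, laplace_dpdf),
      " h =", entropy(laplace_pdf))
```

At equal variance the Gaussian attains the strictly smaller Fisher information and the strictly larger entropy, which is the extremal property the abstract refers to.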