Differential Entropy
We now introduce the concept of differential entropy, the entropy of a continuous random variable. Differential entropy is also related to the shortest description length and is similar in many ways to the entropy of a discrete random variable, but there are some important differences, and some care is needed in using the concept. For example, differential entropy can be negative: a uniform density on $[0, a]$ has $h(X) = \log a$, which is negative whenever $a < 1$.

Let X be a random variable with cumulative distribution function $F(x) = \Pr(X \le x)$. If F(x) is continuous, the random variable is said to be continuous. Let $f(x) = F'(x)$ when the derivative is defined. If $\int_{-\infty}^{\infty} f(x)\,dx = 1$, then f(x) is called the probability density function for X. The set S where f(x) > 0 is called the support set of X.

Definition: The differential entropy h(X) of a continuous random variable X with density f(x) is defined as

$h(X) = -\int_S f(x) \log f(x)\,dx$,

where S is the support set of the random variable. As in the discrete case, the differential entropy depends only on the probability density of the random variable, and hence it is sometimes written as h(f) rather than h(X).
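To make the definition concrete, here is a minimal sketch that evaluates $h(X) = -\int_S f(x)\log f(x)\,dx$ by numerical quadrature and checks it against two known closed forms: $\tfrac{1}{2}\log(2\pi e\sigma^2)$ for a Gaussian density and $\log a$ for a uniform density on $[0, a]$. The helper name differential_entropy and the truncation of the Gaussian's integration range to ±10σ are illustrative choices, not from the original text; SciPy is assumed to be available, and all logarithms are natural (so entropies are in nats).

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def differential_entropy(pdf, lo, hi):
    """Evaluate h(f) = -∫ f(x) log f(x) dx over [lo, hi] by adaptive quadrature.

    Returns the entropy in nats (natural logarithm)."""
    def integrand(x):
        p = pdf(x)
        return -p * np.log(p) if p > 0 else 0.0  # 0 * log 0 is taken as 0
    value, _abs_err = quad(integrand, lo, hi)
    return value

# Gaussian check: the closed form is h = 0.5 * ln(2 * pi * e * sigma^2).
sigma = 2.0
gauss_pdf = stats.norm(loc=0.0, scale=sigma).pdf
numeric = differential_entropy(gauss_pdf, -10 * sigma, 10 * sigma)  # mass beyond 10σ is negligible
closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
print(f"quadrature: {numeric:.6f} nats, closed form: {closed_form:.6f} nats")
# Both print roughly 2.112086 nats.

# Uniform on [0, a]: h = ln(a), which is negative for a < 1.
a = 0.5
uniform_pdf = stats.uniform(loc=0.0, scale=a).pdf
print(differential_entropy(uniform_pdf, 0.0, a), np.log(a))  # both ≈ -0.693
```

The negative result for the uniform density illustrates one of the differences noted above: unlike discrete entropy, differential entropy is not bounded below by zero, since a density can exceed 1 on its support.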