Three Approaches to the Definition of the Notion of Amount of Information

Suppose the variable x can assume values belonging to a finite set X consisting of N elements. We then say that the “entropy” of the variable x is equal to $$H(X) = \log_2 N$$
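The formula above can be sketched in code. This is a minimal illustration (the function name `hartley_entropy` is my own label, not from the text): the entropy of a variable ranging over N equally possible values is log₂ N, and its ceiling gives the number of binary digits sufficient to label any single element of the set.

```python
import math

def hartley_entropy(n_elements: int) -> float:
    """Combinatorial entropy H(X) = log2 N of a variable x ranging
    over a finite set X of n_elements possible values."""
    if n_elements < 1:
        raise ValueError("the set X must be non-empty")
    return math.log2(n_elements)

# A variable over a set of 8 elements carries log2(8) = 3 bits:
print(hartley_entropy(8))                # 3.0
# ceil(H) binary digits suffice to index any one of 100 elements:
print(math.ceil(hartley_entropy(100)))   # 7
```

Note that H(X) depends only on the size of X, not on which values the variable actually takes; this is the purely combinatorial viewpoint, with no probabilities involved.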