The entropy of an irreducible stochastic matrix measures its mixing properties. We show that this quantity is subadditive and strongly subadditive.

Instytut Matematyki, Uniwersytet Jagielloński, Reymonta 4, 30-059 Kraków, Poland; e-mail: slomczyn@im.uj.edu.pl

1 Preliminaries

Let $N \in \mathbb{N}$. We denote the simplex of probability vectors (finite probability distributions) by
$$\Delta_N = \Big\{ p \in \mathbb{R}^N : p_i \geq 0 \text{ for } i = 1, \dots, N \text{ and } \textstyle\sum_{i=1}^{N} p_i = 1 \Big\}$$
and its interior by
$$\Delta_N^{\circ} = \Big\{ p \in \mathbb{R}^N : p_i > 0 \text{ for } i = 1, \dots, N \text{ and } \textstyle\sum_{i=1}^{N} p_i = 1 \Big\}.$$
The extreme points of the simplex will be denoted by $e_k = (0, \dots, 1, \dots, 0)$, with $1$ in the $k$-th position ($k = 1, \dots, N$).

We define the function $\eta : [0, +\infty) \to \mathbb{R}$ by the formula
$$\eta(x) = \begin{cases} -x \ln x & \text{for } x \neq 0 \\ 0 & \text{for } x = 0 \end{cases}.$$
Then $\eta$ is continuous and strictly concave. Moreover, $\eta|_{[0,1]}$ is nonnegative.

The Boltzmann–Shannon entropy $h : \Delta_N \to \mathbb{R}$ is defined by the formula $h(p) = \sum_{i=1}^{N} \eta(p_i)$ for $p \in \Delta_N$. The quantity $h(p)$ can be interpreted as a measure of the uncertainty of $p$; it ranges from $0$ for the extreme points $e_k$ ($k = 1, \dots, N$) to $\ln N$ for the uniform vector $(1/N, \dots, 1/N)$. It is easy to check that $h$ is continuous and concave.

We call a nonnegative matrix $P = (p_{ij})_{i,j=1,\dots,N}$ stochastic iff $\sum_{j=1}^{N} p_{ij} = 1$ for each $i = 1, \dots, N$. We say that a stochastic matrix $P = (p_{ij})_{i,j=1,\dots,N}$ is irreducible iff for every $i, j = 1, \dots, N$ there exists $k \in \mathbb{N}$ such that $(P^k)_{ij} > 0$.
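As an informal check of these definitions, the following NumPy sketch (function names are our own, not from the text) computes the Boltzmann–Shannon entropy $h$ and tests whether a matrix is stochastic and irreducible, using the standard equivalence that an $N \times N$ nonnegative matrix $P$ is irreducible iff every entry of $I + P + \dots + P^{N-1}$ is positive.

```python
import numpy as np

def eta(x):
    # eta(x) = -x ln x for x != 0, and 0 for x = 0
    return 0.0 if x == 0 else -x * np.log(x)

def shannon_entropy(p):
    # Boltzmann-Shannon entropy h(p) = sum_i eta(p_i) for p in the simplex
    return sum(eta(pi) for pi in p)

def is_stochastic(P, tol=1e-12):
    # nonnegative entries and each row summing to 1
    P = np.asarray(P, dtype=float)
    return bool(np.all(P >= 0)) and bool(np.allclose(P.sum(axis=1), 1.0, atol=tol))

def is_irreducible(P):
    # P irreducible iff for all i, j some power P^k has (P^k)_{ij} > 0;
    # equivalently, I + P + ... + P^{N-1} has all entries strictly positive
    P = np.asarray(P, dtype=float)
    N = P.shape[0]
    S, Q = np.eye(N), np.eye(N)
    for _ in range(N - 1):
        Q = Q @ P
        S = S + Q
    return bool(np.all(S > 0))
```

For instance, the uniform vector $(1/2, 1/2)$ has entropy $\ln 2$, an extreme point $e_k$ has entropy $0$, the permutation matrix swapping two states is irreducible, while the identity matrix is stochastic but reducible.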