Markov chains and mixing times

For our purposes, a Markov chain is a (finite or countable) collection of states $S$ and transition probabilities $p_{ij}$, where $i, j \in S$. We write $P = [p_{ij}]$ for the matrix of transition probabilities. Elements of $S$ can be interpreted as the various possible states of whatever system we are interested in studying, and $p_{ij}$ represents the probability that the system is in state $j$ at time $n+1$ if it is in state $i$ at time $n$. We will think of a Markov chain as a stochastic process with state space $S^{\mathbb{N}}$, representing all sequences $X_0, X_1, X_2, \ldots$, where $X_n$ is the state of the system at time $n$. The characterisation of the transition probabilities just given can be expressed as
\[
\mathbb{P}(X_{n+1} = j \mid X_n = i) = p_{ij}.
\]
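To make the definition concrete, here is a minimal sketch in Python that simulates such a chain on a small finite state space. The three-state space, the particular matrix $P$, and the helper `sample_path` are illustrative assumptions, not taken from the text; the only structural requirements are that the entries of $P$ are non-negative and that each row sums to $1$.

```python
import numpy as np

# Illustrative transition matrix for a chain on S = {0, 1, 2}.
# Entry P[i, j] is p_ij, the probability of moving from state i
# to state j in one step, so every row must sum to 1.
P = np.array([
    [0.5, 0.25, 0.25],
    [0.2, 0.6,  0.2],
    [0.3, 0.3,  0.4],
])
assert np.allclose(P.sum(axis=1), 1.0)

rng = np.random.default_rng(seed=0)

def sample_path(P, x0, n_steps):
    """Sample a trajectory X_0, X_1, ..., X_{n_steps}: given X_n = i,
    draw X_{n+1} = j with probability p_ij, i.e. from row i of P."""
    path = [x0]
    for _ in range(n_steps):
        path.append(int(rng.choice(len(P), p=P[path[-1]])))
    return path

print(sample_path(P, x0=0, n_steps=10))
```

Since this particular $P$ has all entries positive, iterating it (e.g. `np.linalg.matrix_power(P, n)`) produces rows that converge to a common stationary distribution as $n$ grows; the speed of this convergence is exactly what mixing times quantify.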