Markov Chains: Basic Definitions
Heuristically, a discrete-time stochastic process has the Markov property if the past and future are independent given the present. In this introductory chapter, we give the formal definition of a Markov chain and of the main objects related to this type of stochastic process and establish basic results. In particular, we will introduce in Section 1.2 the essential notion of a Markov kernel, which gives the distribution of the next state given the current state. In Section 1.3, we will restrict attention to time-homogeneous Markov chains and establish that a fundamental consequence of the Markov property is that the entire distribution of a Markov chain is characterized by the distribution of its initial state and a Markov kernel. In Section 1.4, we will introduce the notion of invariant measures, which play a key role in the study of the long-term behavior of a Markov chain. Finally, in Sections 1.5 and 1.6, which can be skipped on a first reading, we will introduce the notion of reversibility, which is very convenient and is satisfied by many Markov chains, as well as some further properties of kernels seen as operators on certain spaces of functions.
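As a reading aid, here is a minimal LaTeX sketch of the objects these sections build toward, written in standard notation that is assumed here rather than taken from the chapter itself: P denotes a Markov kernel, \xi an initial distribution, \pi an invariant measure, and (X_n) the chain.

```latex
% Notational sketch (assumed conventions, not quoted from the chapter):
% P is a Markov kernel, \xi the initial distribution, \pi an invariant measure.
\begin{align*}
  &\text{Markov property (Section 1.2):} &
    \mathbb{P}(X_{n+1} \in A \mid X_0, \dots, X_n)
      &= P(X_n, A), \\
  &\text{Law of the chain (Section 1.3):} &
    \mathbb{P}_{\xi}(X_0 \in A_0, \dots, X_n \in A_n)
      &= \int_{A_0} \xi(\mathrm{d}x_0) \int_{A_1} P(x_0, \mathrm{d}x_1)
         \cdots \int_{A_n} P(x_{n-1}, \mathrm{d}x_n), \\
  &\text{Invariance (Section 1.4):} &
    \pi P(A) := \int \pi(\mathrm{d}x)\, P(x, A) &= \pi(A), \\
  &\text{Reversibility (Section 1.5):} &
    \pi(\mathrm{d}x)\, P(x, \mathrm{d}y) &= \pi(\mathrm{d}y)\, P(y, \mathrm{d}x).
\end{align*}
```

In words: the kernel P alone propagates the chain one step forward, iterating it against \xi yields all finite-dimensional distributions, invariance means the marginal distribution \pi is preserved under one step, and reversibility (detailed balance) is the stronger symmetry condition that implies invariance.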