A solution to the initial mean consensus problem via a continuum-based Mean Field control approach

This paper presents a continuum approach to the initial mean consensus problem via Mean Field (MF) stochastic control theory. In this formulation: (i) each agent has simple stochastic dynamics in which the control input directly determines the rate of change of its state, and (ii) each agent seeks to minimize an individual cost function containing a mean field coupling to the states of all other agents. For this dynamic game problem, a set of coupled deterministic equations (of Hamilton-Jacobi-Bellman and Fokker-Planck-Kolmogorov type) is derived which approximates the stochastic system of agents in the continuum limit (i.e., as the population size N tends to infinity). In the finite population system (analogous to the individual-based approach): (i) the resulting MF control strategies possess an ε_N-Nash equilibrium property, where ε_N goes to zero as the population size N tends to infinity, and (ii) these MF control strategies steer each agent's state toward the initial state population mean, which is reached asymptotically as time goes to infinity. Hence, under the decentralized MF control strategies the system asymptotically reaches mean consensus on the initial state population mean.
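To make the setup concrete, the following is a minimal sketch of one standard instantiation of such a formulation; the particular symbols (z_i for agent i's state, u_i for its control, w_i for a standard Wiener process, the discount rate ρ, and the control weight r) are illustrative assumptions rather than the exact model of the paper:

\[
dz_i(t) = u_i(t)\,dt + \sigma\,dw_i(t), \qquad 1 \le i \le N,
\]
\[
J_i(u_i, u_{-i}) = \mathbb{E}\int_0^{\infty} e^{-\rho t}\Big[\Big(z_i(t) - \tfrac{1}{N}\textstyle\sum_{j \ne i} z_j(t)\Big)^{2} + r\,u_i^{2}(t)\Big]\,dt.
\]

In the MF approximation, the empirical average \(\tfrac{1}{N}\sum_{j} z_j(t)\) is replaced by a deterministic mean trajectory computed offline from the coupled HJB and FPK equations, so that each agent's best response tracks this trajectory in a decentralized manner, consistent with the consensus behavior described above.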