Stability of hybrid dynamic systems containing singularly perturbed random processes

To meet the challenge of handling dynamic systems in which continuous dynamics and discrete events coexist, we develop a stability analysis for such systems governed by ordinary differential equations and stochastic differential equations with regime switching modulated by a Markov chain involving a small parameter. The small parameter reflects either the inherent two-time-scale structure of the original system or the different rates of regime switching among a large number of discrete-event states. The smaller the parameter, the more rapid the switching the system experiences. To reduce complexity, one attempts to effectively "replace" the actual system by a limit system with a simpler structure. To ensure the validity of such a replacement over a long time horizon, it is crucial that the original system be stable. The fast regime changes and the large state space of the Markov chain make the stability analysis difficult. Under suitable conditions, using the limit dynamic systems and perturbed Lyapunov function methods, we show that if the limit systems are stable, then so are the original systems. This justifies the replacement of a complicated original system by its limit from the point of view of long-time behavior.

Index Terms—Hybrid system, Lyapunov method, Markov chain, singular perturbation, stability, switching diffusion.
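For concreteness, the following is a minimal sketch of the kind of model the abstract describes, written in generic notation that is not necessarily the paper's own: a switching diffusion whose discrete component is a fast Markov chain with a two-time-scale generator. The drift f, diffusion coefficient sigma, and the generator decomposition below are illustrative placeholders; the precise hypotheses are those stated in the paper.

% Illustrative sketch only: generic notation, not the paper's exact formulation.
\[
  dx^{\varepsilon}(t) = f\bigl(x^{\varepsilon}(t), \alpha^{\varepsilon}(t)\bigr)\,dt
    + \sigma\bigl(x^{\varepsilon}(t), \alpha^{\varepsilon}(t)\bigr)\,dw(t),
\]
% Here \alpha^{\varepsilon}(\cdot) is a continuous-time Markov chain on a finite
% state space \mathcal{M} = \{1, \dots, m\}, and its generator has the
% singularly perturbed (two-time-scale) form
\[
  Q^{\varepsilon} = \frac{\widetilde{Q}}{\varepsilon} + \widehat{Q},
\]
% so that as \varepsilon \to 0 the fast switching driven by \widetilde{Q}/\varepsilon
% averages out, and the original system is approximated by a limit system obtained
% by averaging the drift and diffusion with respect to the quasi-stationary
% distributions of the fast chain.

In this kind of setup, the stability question is whether Lyapunov-type stability established for the averaged limit system carries over to the original system for small enough ε, which is the sense in which the abstract speaks of justifying the replacement over a long time horizon.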
