Optimal Control of Hybrid Systems with an Infinite Set of Discrete States

Hybrid control systems are described by a family of continuous subsystems and a set of logic rules for switching between them. This paper concerns a broad class of optimization problems for hybrid systems in which the continuous subsystems are modelled as differential inclusions. The formulation allows endpoint constraints and a general objective function that includes “transaction costs” associated with abrupt changes of discrete and continuous states, as well as terms associated with continuous control action and with the terminal value of the continuous state. As a consequence of the endpoint constraints, the value function may be discontinuous. It is shown that the collection of value functions (associated with all discrete states) is the unique lower semicontinuous solution of a system of generalized quasi-variational inequalities of Bensoussan-Lions type, suitably interpreted for nondifferentiable, extended-valued functions. It is also shown how optimal strategies and value functions are related. The proof techniques are system theoretic, i.e., based on the construction of state trajectories with suitable properties. A distinctive feature of the analysis is that it permits an infinite set of discrete states.
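For orientation, a schematic form of the kind of quasi-variational inequality system referred to above is sketched below. The notation is illustrative only and is not taken from the paper: H_q stands for a Hamiltonian built from the differential inclusion active in discrete state q, g_{q,q'} for a reset map and c_{q,q'} for a switching cost, and the gradient must be understood in a suitable nonsmooth (e.g. proximal or viscosity) sense, since the value functions considered here are merely lower semicontinuous and extended valued.

% Schematic Bensoussan-Lions type quasi-variational inequality system
% for a family of value functions V_q, one per discrete state q in Q.
% (Illustrative placeholders H_q, g_{q,q'}, c_{q,q'}; not the paper's notation.)
\[
  \min\Bigl\{\, -H_q\bigl(x, \nabla V_q(x)\bigr),\;
                V_q(x) - (\mathcal{M}V)_q(x) \,\Bigr\} \;=\; 0,
  \qquad q \in Q,
\]
\[
  (\mathcal{M}V)_q(x) \;=\;
  \inf_{q' \neq q}\Bigl\{\, V_{q'}\bigl(g_{q,q'}(x)\bigr)
        + c_{q,q'}(x) \,\Bigr\},
\]
where the second line is the switching (obstacle) operator that couples the value functions associated with the different discrete states.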
