Nonlinear and Optimal Control Systems

From the Publisher: Nonlinear and Optimal Control Systems offers a self-contained introduction to analysis techniques used in the design of nonlinear and optimal feedback control systems, with a solid emphasis on the fundamental topics of stability, controllability, optimality, and the corresponding geometry. The book develops and presents these key subjects in a unified fashion, using an integrated approach to stability theory, function-minimizing feedback controls, optimal controls, and differential game theory.

Starting with a background on differential equations, this accessible text examines nonlinear dynamical systems and nonlinear control systems, including basic results in nonlinear parameter optimization and parametric two-player games. Lyapunov stability theory and control system design are discussed in detail, followed by in-depth coverage of the controllability minimum principle and other important controllability concepts. Pontryagin's minimum principle is then developed and applied to optimal control problems and the design of optimal controllers.

Nonlinear and Optimal Control Systems features examples and exercises taken from a wide range of disciplines and contexts, from engineering control designs to biological, economic, and other systems. Numerical algorithms are provided for solving problems in optimization and control, as well as for simulating systems governed by nonlinear differential equations. Readers may develop their own code from these algorithms or solve problems with the help of commercial software packages. Providing a sturdy foundation in nonlinear and optimal control system design and application, this resource is a valuable asset to advanced students and professional engineers in many fields.
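
As a rough illustration of the kind of simulation the description refers to, the sketch below integrates a simple nonlinear system with a fixed-step fourth-order Runge-Kutta scheme. The choice of system (the Van der Pol oscillator), step size, and horizon are assumptions made for this example only; this is not the book's own algorithm or code.

import numpy as np

def van_der_pol(t, x, mu=1.0):
    # Van der Pol oscillator, used here only as a stand-in nonlinear system.
    return np.array([x[1], mu * (1.0 - x[0] ** 2) * x[1] - x[0]])

def rk4(f, x0, t0, tf, h):
    # Classical fixed-step Runge-Kutta integration of dx/dt = f(t, x).
    ts = np.arange(t0, tf + h, h)
    xs = np.empty((len(ts), len(x0)))
    xs[0] = x0
    for k in range(len(ts) - 1):
        t, x = ts[k], xs[k]
        k1 = f(t, x)
        k2 = f(t + h / 2.0, x + h / 2.0 * k1)
        k3 = f(t + h / 2.0, x + h / 2.0 * k2)
        k4 = f(t + h, x + h * k3)
        xs[k + 1] = x + h / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
    return ts, xs

if __name__ == "__main__":
    ts, xs = rk4(van_der_pol, np.array([2.0, 0.0]), 0.0, 20.0, 0.01)
    print(xs[-1])  # state at the end of the simulation horizon

A variable-step solver (or a commercial package, as the description suggests) would normally be preferred in practice; the fixed-step version is shown only because it is short and self-contained.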
