Chapter 10 – Optimal Control
Publisher Summary
Many problems in engineering can be solved by minimizing a measure of cost or maximizing a measure of performance. The designer must select a performance measure that captures the most important criteria of the problem and reflects their relative importance, and must also choose a mathematical form of the function that keeps the optimization problem tractable. This chapter introduces optimal control theory for discrete-time systems. It begins with the unconstrained minimization of a cost function and then extends the solution to problems with equality constraints. Building on these results, it covers the optimal control of discrete-time systems, specializes to the linear quadratic regulator, and derives the optimality conditions for both a finite and an infinite planning horizon. In addition, the chapter addresses the regulator problem in which the system is required to track a nonzero constant signal.
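As an illustration of the infinite-horizon linear quadratic regulator mentioned above, the sketch below computes a steady-state state-feedback gain for a discrete-time system x_{k+1} = A x_k + B u_k by iterating the discrete Riccati difference equation until it converges. This is a minimal example, not the chapter's own code: the function name dlqr_gain, the stopping tolerance, and the numerical values of A, B, Q, and R are all assumptions chosen for illustration, and the plant is assumed stabilizable so that the iteration converges.

```python
import numpy as np

def dlqr_gain(A, B, Q, R, n_iter=500, tol=1e-10):
    """Steady-state discrete-time LQR gain via backward Riccati iteration.

    Minimizes J = sum_k (x_k' Q x_k + u_k' R u_k) subject to
    x_{k+1} = A x_k + B u_k, with control law u_k = -K x_k.
    (Hypothetical helper for illustration; not from the chapter.)
    """
    P = Q.copy()  # terminal cost P_N = Q, one common choice
    for _ in range(n_iter):
        # One step of the discrete Riccati difference equation:
        # P <- Q + A'PA - A'PB (R + B'PB)^{-1} B'PA
        BtP = B.T @ P
        K = np.linalg.solve(R + BtP @ B, BtP @ A)
        P_next = Q + A.T @ P @ (A - B @ K)
        if np.max(np.abs(P_next - P)) < tol:
            P = P_next
            break
        P = P_next
    # Optimal steady-state gain for the converged P
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    return K, P

# Illustrative discrete plant (values assumed, roughly a sampled double integrator)
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.005],
              [0.1]])
Q = np.diag([1.0, 0.1])
R = np.array([[0.01]])

K, P = dlqr_gain(A, B, Q, R)
print("Optimal gain K =", K)
print("Closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```

For a finite planning horizon, the same recursion is simply run backward for the fixed number of steps, producing a time-varying gain sequence instead of the single steady-state gain returned here.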