Probabilistic and Randomized Methods for Design Under Uncertainty

The editors, who are also among the chapter authors, have collected 16 chapters by 34 leading experts on the mathematical theory of stochastic optimization methods with potential application to the engineering design of complex systems (e.g., telecom and computer networks and aircraft control systems). Make no mistake, however: this is not an applied text. The emphasis is on mathematical theory, and the presentation is highly mathematical and intended for an audience of researchers in the fields of control and optimization. The level of mathematical sophistication is typical of a Journal of the American Statistical Association article, and nearly all chapters are article length and similar in structure: problem introduction, theorems and proofs, examples, and conclusions. This book will probably appeal only to those Technometrics readers who have a control systems or operations research background. Also, if you derive your inspiration primarily from application examples, they are in limited supply here.

In the Preface, the editors state that their objective is to discover engineering design parameter settings that produce the most robust (insensitive) system response to process disturbances or loads when the magnitude and description of the disturbances are uncertain. Their subject is the mathematical theory common to control systems and generic decision optimization problems with inexact design data. Their strategy is to bring together researchers from the fields of optimization and control, with the intention of highlighting the opportunities for synergistic interaction between the two fields and focusing on randomized and probabilistic techniques for solving engineering design problems in the presence of stochastic uncertainty. The result is a mathematical "tour de force" of the current state of research in the mathematics of optimization problems in which chance plays a substantial role.

Chance may enter these problem formulations through probabilistic constraints or stochastic solution methods (Part I, Chaps. 1–4), the search for robust design outcomes via randomization and sampling methods (Part II, Chaps. 5–9), or the use of probabilistic methods of system identification (fitting time series) and control (Part III, Chaps. 10–16). Chapters 1–4 cover scenario approximations of chance constraints, optimization models with probabilistic constraints, a theoretical framework for comparing several stochastic optimization approaches, and the optimization of risk measures. Chapters 5–9 explore sampled convex programs and probabilistically robust design, randomized constraint sampling applied to the game Tetris, near-optimal solutions to least squares problems with stochastic uncertainty, the randomized ellipsoid algorithm for constrained robust least squares problems, and randomized algorithms for semi-infinite programming problems. Chapters 10–16 discuss a learning theory approach to system identification and stochastic adaptive control, probabilistic design of a robust controller using a parameter-dependent Lyapunov function, probabilistic robust controller design using the probable near-minimax value and randomized algorithms, sampling random transfer functions, nonlinear systems stability via random and quasi-random methods, probabilistic control of nonlinear uncertain systems, and fast randomized algorithms for probabilistic robustness analysis.
The writing is generally quite readable, although the density of mathematical symbolism varies between chapters and the notation is not generally defined in chapter glossaries. The common reference list is extensive (411 entries!). The index is a scant two pages. Graphics are limited (21 figures) but useful. English usage is uniformly good, and the editors have made the material understandable to a mathematically sophisticated audience.

Although only Chapters 13–15 discuss detailed examples, the application areas cited throughout include matching cash flows to demands, truss topology, robust antenna array design, portfolio optimization, robust estimation, control system analysis and synthesis, hard stochastic control, interpolation of interval data, affine uncertainty, Kalman filter design for uncertain systems, optimal and robust control, optimal experimental design, system reliability, system identification (fitting time series), stability of a tower crane, aircraft lateral motion control, the multidisk control problem, linear time-invariant plant transfer functions, mobile robot stability, and hypersonic aircraft control models.

I found Chapter 3, by Spall, Hill, and Stark, quite useful; it is the exception to the style of the other chapters, making minimal use of symbolism. Their comparison of optimization approaches complements the book by Spall (2003) on stochastic optimization, reviewed by Hesterberg (2004). Chapter 7, by the editors, was quite approachable because of its more typical algebraic development, numerical examples, and graphics. Their discussion of the learning theory approach to stochastic uncertain least squares problems was straightforward and informative. Chapter 15, by Wang and Stengel, on hypersonic aircraft control models was interesting and informative, perhaps because of the examples. The remaining chapters are sufficiently abstract to appeal primarily to researchers and subject matter experts. It appears that the editors have succeeded in their intention, and it was certainly eye-opening to see the current state of research in stochastic optimization theory. The individual chapter authors have contributed a vast reference list that alone is worth the price.

[1] Spall, J. C. (2003), Introduction to Stochastic Search and Optimization: Estimation, Simulation, and Control, Wiley-Interscience Series in Discrete Mathematics and Optimization, Hoboken, NJ: Wiley-Interscience.