Minimum variance control of first-order systems with a constraint on the input amplitude

In this note we consider the problem of controlling a first-order stochastic system so that the stationary variance of the state is minimized when the amplitude of the input signal is constrained. The problem is found to admit an analytical solution in the form of a nonlinear feedback strategy. The result is applied to amplitude-constrained minimum variance control of first-order ARMA processes with time delay.
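The setting can be illustrated with a small simulation. The model below, x[k+1] = a*x[k] + u[k] + e[k], and the saturated linear ("clipped") feedback are illustrative assumptions on my part, not the paper's optimal nonlinear strategy; the sketch merely shows how an amplitude constraint on the input raises the achievable stationary variance above the unconstrained minimum-variance level (which equals the noise variance for this model).

```python
import random

def stationary_variance(a=0.7, sigma=1.0, u_max=0.5, n=200_000, seed=1):
    """Estimate the stationary state variance of
        x[k+1] = a*x[k] + u[k] + e[k],   e[k] ~ N(0, sigma^2),
    under the clipped linear feedback
        u[k] = clip(-a*x[k], -u_max, u_max).
    When |a*x[k]| <= u_max the feedback cancels the dynamics exactly
    (minimum variance control); when the constraint is active it cannot,
    and the stationary variance exceeds sigma^2.
    """
    rng = random.Random(seed)
    x, sum_sq = 0.0, 0.0
    for _ in range(n):
        u = max(-u_max, min(u_max, -a * x))   # amplitude-constrained input
        x = a * x + u + rng.gauss(0.0, sigma)
        sum_sq += x * x
    return sum_sq / n  # empirical stationary variance of the state

# Loose constraint: variance approaches sigma^2 = 1 (unconstrained optimum).
var_loose = stationary_variance(u_max=100.0)
# Tight constraint: the saturation leaves residual dynamics, so variance grows.
var_tight = stationary_variance(u_max=0.3)
```

The gap between `var_tight` and `var_loose` is what the optimal nonlinear strategy of the note is designed to minimize subject to the constraint; the clipped-linear law used here is only a convenient baseline.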