A diffusion process model for the optimal operation of a reservoir system

The water level in a reservoir is modelled as a controlled diffusion process on a compact interval of the real line. The problem is to control the water discharge rate so as to minimise the expected costs, which depend upon the histories of the water levels and release rates. The form of the optimal control is studied for two general classes of reservoir control problems.
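To make the setting concrete, the kind of model described above can be sketched with a short Euler-Maruyama simulation: the water level follows a diffusion on a compact interval, a release rule plays the role of the control, and a running cost accumulates along the trajectory. The specific drift, noise level, release rule, and cost functional below are illustrative assumptions, not the paper's model.

```python
import math
import random

def simulate(T=100.0, dt=0.01, x0=0.5, xmax=1.0,
             inflow=0.3, sigma=0.1, seed=0):
    """Simulate a controlled diffusion for the water level on [0, xmax].

    All dynamics and cost terms here are placeholders chosen for
    illustration; the paper studies general cost functionals that
    depend on the histories of the level and the release rate.
    """
    rng = random.Random(seed)
    x, cost = x0, 0.0
    for _ in range(int(T / dt)):
        # Hypothetical bang-bang release rule: discharge hard when
        # the reservoir is nearly full, otherwise release slowly.
        u = 0.6 if x > 0.8 * xmax else 0.1
        dw = rng.gauss(0.0, math.sqrt(dt))          # Brownian increment
        x += (inflow - u) * dt + sigma * dw          # Euler-Maruyama step
        x = min(max(x, 0.0), xmax)                   # keep level in [0, xmax]
        # Illustrative running cost: penalise deviation from mid-level
        # and the magnitude of the release.
        cost += ((x - 0.5 * xmax) ** 2 + 0.05 * u ** 2) * dt
    return x, cost

final_level, total_cost = simulate()
```

A policy of this bang-bang form (release at one of two fixed rates, switching at a threshold level) is a typical candidate shape for the optimal control in one-dimensional problems of this type, which is why it is used as the placeholder rule here.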