A discrete time population control model

Abstract. A discrete-time stochastic model is assumed to describe the growth behavior of a natural animal, pest, or even epidemic population. Periodically, a control action representing, for example, harvesting or extermination can be taken to modify the future growth of the population. Dynamic programming can be used to generate optimal control policies for models in which growth and control produce economically measurable benefits or costs. This paper uses dynamic programming to find conditions under which the classically simple “single critical number policy” is optimal. These conditions are then extended to cover a wide range of control models, and the resulting optimal policy is characterized as a “single critical function policy.”
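The abstract's "single critical number policy" can be illustrated with a small sketch. This is not the paper's model; all dynamics, costs, and parameters below (the 1.2 growth multiplier, the quadratic damage cost, the prices and shock probabilities) are assumptions chosen for illustration. The sketch runs a finite-horizon dynamic program for a harvested population and checks the threshold structure of the computed policy: because the one-period return separates additively in the current population x, the optimal post-harvest level is independent of x above a critical number S, so the policy harvests the population down to S whenever it exceeds S.

```python
# Hypothetical illustration of a single critical number policy via
# finite-horizon stochastic dynamic programming (not the paper's model).
# State: population x in 0..X_MAX. Action: harvest down to y, 0 <= y <= x.
# One-period return: revenue P per unit harvested minus a damage cost of
# leaving y animals; future growth of y is stochastic. All forms assumed.

X_MAX = 60      # population cap (assumed)
P = 1.0         # revenue per unit harvested (assumed)
BETA = 0.9      # discount factor (assumed)
HORIZON = 30    # number of DP stages (assumed)

def damage(y):
    # convex per-period damage cost of a post-harvest population y (assumed)
    return 0.005 * y * y

def growth_dist(y):
    # stochastic growth: population multiplies by roughly 1.2, plus a
    # three-point shock; returns (next_state, probability) pairs (assumed)
    base = min(X_MAX, int(round(1.2 * y)))
    outcomes = {}
    for shock, prob in [(-2, 0.25), (0, 0.5), (2, 0.25)]:
        nxt = max(0, min(X_MAX, base + shock))
        outcomes[nxt] = outcomes.get(nxt, 0.0) + prob
    return list(outcomes.items())

def solve():
    # backward induction: V is the value-to-go, policy[x] the optimal
    # post-harvest level when the current population is x
    V = [0.0] * (X_MAX + 1)
    policy = [0] * (X_MAX + 1)
    for _ in range(HORIZON):
        newV = [0.0] * (X_MAX + 1)
        for x in range(X_MAX + 1):
            best, best_y = float("-inf"), 0
            for y in range(x + 1):
                ev = sum(p * V[nxt] for nxt, p in growth_dist(y))
                val = P * (x - y) - damage(y) + BETA * ev
                if val > best:
                    best, best_y = val, y
            newV[x], policy[x] = best, best_y
        V = newV
    return V, policy

V, policy = solve()

# Threshold (critical-number) structure: above S the policy always harvests
# down to the same level S. Under concavity conditions like those studied in
# the paper, the full single-critical-number form policy[x] = min(x, S) holds.
S = policy[X_MAX]
assert all(policy[x] == S for x in range(S, X_MAX + 1))
assert all(policy[x] <= x for x in range(X_MAX + 1))
```

The key design point is the additive separation `P*x + (-P*y - damage(y) + BETA*ev)` in the Bellman maximand: the x-dependence is a constant with respect to the decision y, so the unconstrained optimizer S is the same for every x with x >= S, which is exactly what makes a single critical number sufficient to describe the policy.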