A Discrete-Time Population-Control Model with Setup Cost

A discrete-time stochastic model is often used to describe a natural animal, pest, or epidemic population. Control action, representing harvesting, extermination, etc., can be taken periodically to reduce the current population level and thereby modify its future growth. Dynamic programming can be used to determine optimal control policies for models in which growth and control produce economically measurable benefits and/or costs. When control action incurs a setup charge plus a cost component linear in the amount of state reduction produced, the optimal policy is characterized by a pair (s_n, S_n): in period n, the population is reduced to state s_n whenever it is found to be above state S_n. The analogy with inventory theory is exploited in proving the result.
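The backward-induction computation behind such a policy can be sketched as follows. This is a minimal illustrative example, not the paper's model: the state space, horizon, cost parameters (setup charge K, unit reduction cost c, per-period damage rate d), and growth distribution are all assumptions chosen for the sketch. In each period, the controller may cut the population from x down to any y <= x at cost K + c*(x - y), pays a damage cost proportional to the post-control level, and the population then grows by a random increment.

```python
# Illustrative finite-horizon DP for a setup-cost population-control
# model. All numbers below are hypothetical, chosen only to show the
# structure of the computation.

M = 20          # maximum population level (state-space truncation)
N = 8           # planning horizon (periods)
K = 5.0         # setup charge incurred whenever any reduction is made
c = 1.0         # cost per unit of population removed
d = 0.6         # per-period damage cost per unit of population
growth = {0: 0.2, 1: 0.3, 2: 0.3, 3: 0.2}   # P(growth increment = k)

V = [0.0] * (M + 1)   # terminal values V_N(x) = 0
policy = []           # policy[n][x] = optimal post-control level y

for n in range(N - 1, -1, -1):
    newV = [0.0] * (M + 1)
    act = [0] * (M + 1)
    for x in range(M + 1):
        best, besty = float("inf"), x
        for y in range(x + 1):
            # Setup plus linear cost, charged only if a reduction occurs.
            control = 0.0 if y == x else K + c * (x - y)
            # Expected continuation value after random growth from y.
            future = sum(p * V[min(M, y + k)] for k, p in growth.items())
            total = control + d * y + future
            if total < best - 1e-12:
                best, besty = total, y
        newV[x], act[x] = best, besty
    V, policy = newV, [act] + policy

# With a cost structure of this shape, the computed rule takes the
# (s, S) form described above: below some threshold S nothing is done,
# and above it the population is cut down to a level s.
act0 = policy[0]
S = next((x for x in range(M + 1) if act0[x] < x), None)
if S is not None:
    print("period 0: intervene above S =", S, "reducing to s =", act0[S])
```

The inner minimization over y <= x is the direct analogue of the order-up-to decision in inventory theory, with the direction reversed: here the fixed charge is paid to move the state down rather than up.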