An Impulse Control Problem with Quadratic Drift Term

This paper discusses an impulse control problem in which the drift of the state dynamics is quadratic in the control variable and the control itself is constrained. We establish the HJB equation to obtain a closed form for the value function and, in some cases, the optimal control.

Introduction

The impulse control problem was introduced by Bensoussan and Lions in 1973 [1,2] and developed by Richard in 1977 [3]; since then it has been widely studied. For example, in 1986 Liu Kunhui [9,10] discussed a more general state and cost structure based on Richard's model and proved the existence of an optimal control. More recently, applications of the impulse control approach to mathematical finance, economics and personnel management have been investigated in many studies: Cadenillas and Zapatero [4] considered a foreign exchange market whose state is driven by a geometric Brownian motion with a linear cost function; Masamitsu Ohnishi and Motoh Tsujimura [5] considered the same state dynamics with a quadratic cost function; Gabriela Mundaca and Bernt Øksendal [6] studied an impulse control problem whose drift term is an increasing concave function $F$ of the interest-rate differential $r_t - \bar r_t$, and for the special case $F(x) = ax$ gave an explicit optimal control. The contribution of this paper is that, for an impulse control problem with a quadratic drift term, we prove the existence of an optimal control and give it in explicit form.

Mathematical Model

Let $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t \ge 0}, P)$ be a complete probability space whose filtration $\{\mathcal{F}_t\}_{t \ge 0}$ is right-continuous and such that $\mathcal{F}_t$ contains all the $P$-null sets of $\mathcal{F}$. We assume that a standard Brownian motion $W = \{W(t) : t \ge 0\}$ with respect to $\{\mathcal{F}_t\}_{t \ge 0}$ is given on this probability space.

An impulse control for the system is a double sequence
$$\nu = (T_1, T_2, \dots, T_i, \dots ;\ \xi_1, \xi_2, \dots, \xi_i, \dots),$$
where $0 \le T_1 < T_2 < \dots < T_i < \dots$ is an increasing sequence of stopping times with $T_n \to \infty$ as $n \to \infty$, and each $\xi_i : \Omega \to [q, +\infty)$ is an $\mathcal{F}_{T_i}$-measurable random variable; here $q > 0$ is a small constant representing the fixed cost of each impulse.

The controlled process is described as follows:
$$dX(t) = \left(\mu U(t) - aU(t)^2 - \delta\right)dt + \sigma U(t)\,dW(t), \qquad t \in (T_i, T_{i+1}] \cap [0, \infty),$$
$$X(T_i^+) = X(T_i^-) - \xi_i, \quad i = 1, 2, 3, \dots, \qquad X(0) = x \in \mathbb{R}^+. \tag{1}$$
Here $\mu, \sigma, \delta$ are given positive constants, and $U(\cdot)$ is an $\mathcal{F}_t$-adapted process constrained by $k_1 \le U(t) \le k_2$ for some constants $k_1, k_2$ with $0 < k_1 < k_2 < \infty$. An admissible stochastic control $\pi = (U, \nu)$ consists of a combination of a regular control and an impulse control such that
$$X(T_i^+) = X(T_i^-) - \xi_i \ge 0, \quad i = 1, 2, 3, \dots \tag{2}$$
We denote the class of all admissible controls by $\Pi$.
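To make the model (1)–(2) concrete, the following is a minimal simulation sketch, not part of the paper: the parameter values, the feedback rule `regular_control`, and the band-type rule `impulse` are illustrative assumptions chosen only so the example runs; the paper itself derives the optimal regular and impulse controls from the HJB equation.

```python
import numpy as np

# Illustrative parameters (assumed, not taken from the paper).
mu, a, delta, sigma = 1.0, 0.5, 0.2, 0.3
k1, k2 = 0.1, 1.0          # bounds on the regular control: k1 <= U(t) <= k2
q = 0.05                   # minimal impulse size (fixed cost per impulse)
x0 = 1.0                   # initial state X(0) = x > 0

def regular_control(x):
    """A hypothetical feedback rule for U(t), clipped to [k1, k2].
    Here it is simply the maximiser of mu*u - a*u^2, independent of the state."""
    return float(np.clip(mu / (2.0 * a), k1, k2))

def impulse(x, upper=2.0, target=1.0):
    """A hypothetical band-type impulse rule: when X reaches `upper`, push it
    down to `target`; the impulse size xi must satisfy xi >= q and must leave
    the state non-negative, as required by condition (2)."""
    if x >= upper and x - target >= q:
        return x - target
    return 0.0

def simulate(T=10.0, n=10_000, seed=0):
    """Euler-Maruyama simulation of the controlled dynamics (1)."""
    rng = np.random.default_rng(seed)
    dt = T / n
    x = x0
    path, impulses = [x], []
    for step in range(n):
        u = regular_control(x)
        dw = rng.normal(0.0, np.sqrt(dt))
        # dX = (mu*U - a*U^2 - delta) dt + sigma*U dW between impulse times
        x += (mu * u - a * u**2 - delta) * dt + sigma * u * dw
        xi = impulse(x)
        if xi > 0.0:
            impulses.append(((step + 1) * dt, xi))   # record (T_i, xi_i)
            x -= xi                                  # X(T_i^+) = X(T_i^-) - xi_i
        path.append(x)
    return np.array(path), impulses

if __name__ == "__main__":
    path, imps = simulate()
    print(f"final state: {path[-1]:.4f}, number of impulses: {len(imps)}")
```

Replacing the two placeholder rules with the policy obtained from the HJB equation would turn this sketch into a numerical check of the optimal control.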

[1] G. G. Stokes, "J.", 1890, The New Yale Book of Quotations.