Stochastic Games with Short-Stage Duration

We introduce an asymptotic analysis of stochastic games with short-stage duration. The play of stage $k$, $k\ge 0$, of a stochastic game $\Gamma_\delta$ with stage duration $\delta$ is interpreted as the play during the time interval $k\delta \le t < (k+1)\delta$. Accordingly, the average payoff per unit of time of the $n$-stage play is the sum of the payoffs in the first $n$ stages divided by $n\delta$, and the $\lambda$-discounted present value of a payoff $g$ received in stage $k$ is $\lambda^{k\delta} g$. We define convergence, strong convergence, and exact convergence of the data of a family $(\Gamma_\delta)_{\delta>0}$ as the stage duration $\delta$ goes to $0$, and study the asymptotic behavior of the value, optimal strategies, and equilibrium. We define asymptotic analogs of the discounted, limiting-average, and uniform equilibrium payoffs, and show that convergence implies the existence of an asymptotic discounted equilibrium payoff, strong convergence implies the existence of an asymptotic limiting-average equilibrium payoff, and exact convergence implies the existence of an asymptotic uniform equilibrium payoff.
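
For concreteness, the two payoff evaluations described above can be written out explicitly; this is a brief sketch in which $g_k$, the payoff received in stage $k$, is notation introduced here for illustration. The average payoff per unit of time of the $n$-stage play (total elapsed time $n\delta$) is
\[
  \frac{1}{n\delta}\sum_{k=0}^{n-1} g_k ,
\]
and, assuming the usual normalization that makes the discount weights $\lambda^{k\delta}$ sum to one, the $\lambda$-discounted valuation of the play is
\[
  \bigl(1-\lambda^{\delta}\bigr)\sum_{k=0}^{\infty} \lambda^{k\delta}\, g_k .
\]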