We consider a class of singularly perturbed zero-sum differential games with piecewise deterministic dynamics, where switches from one dynamic structure to another are governed by a finite-state Markov process. Player 1 controls the continuous dynamics, whereas Player 2 controls the transition rates of the finite-state Markov process; both players observe the states of both processes. Player 1 seeks to minimize a given cost. For Player 2, we consider two possible scenarios: one in which it minimizes the same cost (team framework), and one in which it maximizes it (zero-sum game). The transition rates of the Markov process are fast, of order 1/ε. To solve this problem, we use the dynamic programming approach. In particular, we study the asymptotic properties of the underlying system for sufficiently small ε. The viscosity solution method is employed to establish convergence of the value function; it yields convergence in a general setting and helps characterize the structure of the limit system. We apply these results to the special case of linear-quadratic games with jump parameters, for which the limiting problem admits an explicit solution.
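A schematic formulation of the model described above may help fix ideas; the symbols and the exact functional form below are our assumptions for illustration, not taken from the paper.

```latex
% Continuous dynamics controlled by Player 1, with structure switched
% by a finite-state process \theta^\varepsilon (assumed notation):
\dot{x}(t) = f\bigl(x(t), \theta^\varepsilon(t), u(t)\bigr), \qquad x(0) = x_0 .

% \theta^\varepsilon takes values in a finite set; Player 2 chooses its
% transition rates, which are fast, of order 1/\varepsilon:
\Pr\bigl(\theta^\varepsilon(t+\delta) = j \mid \theta^\varepsilon(t) = i\bigr)
  = \tfrac{1}{\varepsilon}\, q_{ij}\bigl(v(t)\bigr)\,\delta + o(\delta),
  \qquad i \neq j .

% Cost minimized by Player 1; Player 2 either minimizes it (team
% framework) or maximizes it (zero-sum game):
J(u, v) = \mathbb{E}\!\left[ \int_0^T g\bigl(x(t), \theta^\varepsilon(t), u(t)\bigr)\,dt
  + G\bigl(x(T)\bigr) \right].
```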
Number of pages: 6
Journal: Proceedings of the IEEE Conference on Decision and Control
Publication status: Published - 2000