Continuity of Filters for Discrete-Time Control Problems Defined by Explicit Equations
- Author
- Feinberg, Eugene A., Ishizawa, Sayaka, Kasyanov, Pavlo O., and Kraemer, David N.
- Subjects
- Mathematics - Optimization and Control; Primary 90C40, Secondary 62C05, 90C39
- Abstract
Discrete-time control systems whose dynamics and observations are described by stochastic equations are common in engineering, operations research, health care, and economics. For example, stochastic filtering problems are usually defined via stochastic equations. These problems can be reduced to Markov decision processes (MDPs) whose states are posterior state distributions, and such MDPs are sometimes called filters. This paper investigates sufficient conditions on the transition and observation functions of the original problems that guarantee weak continuity of the transition probabilities of the filter MDP. Under mild conditions on cost functions, weak continuity implies the existence of optimal policies minimizing the expected total costs, the validity of optimality equations, and the convergence of value iteration to optimal values. This paper uses recent results on weak continuity of filters for partially observable MDPs defined by transition and observation probabilities. It develops a criterion for weak continuity of transition probabilities and a sufficient condition for continuity in total variation of transition probabilities. The results are illustrated with applications to filtering problems.
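The reduction described in the abstract turns each belief (posterior state distribution) into a state of the filter MDP, updated by a Bayes step after every action and observation. A minimal sketch of that posterior update for a finite state space is below; the matrices and their numbers are hypothetical illustrations, not taken from the paper:

```python
import numpy as np

# Hypothetical finite-state model (not from the paper):
# P_a[x, x'] = transition probability from x to x' under a fixed action a,
# Q[x', y]   = probability of observing y when the new state is x'.

def filter_update(belief, P_a, Q, y):
    """One Bayes step: prior belief -> posterior after acting and observing y."""
    predicted = belief @ P_a       # predict: push the belief through the dynamics
    unnorm = predicted * Q[:, y]   # correct: weight by the observation likelihood
    return unnorm / unnorm.sum()   # normalize back to a probability distribution

# Two states, one action, two observations (all numbers made up for illustration).
P_a = np.array([[0.9, 0.1],
                [0.2, 0.8]])
Q = np.array([[0.7, 0.3],
              [0.1, 0.9]])

belief = np.array([0.5, 0.5])
posterior = filter_update(belief, P_a, Q, y=1)  # new filter-MDP state
```

The paper's question can be read in these terms: under what conditions on the underlying transition and observation functions does `posterior` depend (weakly) continuously on `belief` and the action, so that standard MDP optimality results apply to the filter.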
- Published
- 2023