Optimal Control of a Discrete-Time Stochastic System with a Probabilistic Criterion and a Non-fixed Terminal Time
Published in: Automation and Remote Control, Vol. 81, No. 12, pp. 2143–2159
Format: Journal Article
Language: English
Published: Moscow: Pleiades Publishing, 01.12.2020 (Springer Nature B.V.)
Summary: This paper considers an optimal control problem for a discrete-time stochastic system with the probability of first reaching the boundaries of a given domain as the optimality criterion. Dynamic programming-based sufficient conditions of optimality are formulated and proved. The isobells of levels 1 and 0 of the Bellman function are used for obtaining two-sided estimates of the right-hand side of the dynamic programming equation, two-sided estimates of the Bellman function, and two-sided estimates of the optimal-value function of the probabilistic criterion. A suboptimal control design method is proposed. The conditions of equivalence to an optimal control problem with a probabilistic terminal criterion are established. An illustrative example is given.
ISSN: 0005-1179, 1608-3032
DOI: 10.1134/S0005117920120012
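The backward dynamic programming recursion behind the probabilistic criterion described in the summary can be illustrated on a toy problem. The sketch below is not the paper's construction: the scalar dynamics, the noise distribution, the horizon, and the target set are all illustrative assumptions. The Bellman function `V[k][x]` here is the maximal probability of reaching the target set from state `x` at step `k`, with value 1 on the target set (absorption) and the recursion maximizing the expected continuation value over controls.

```python
# Hedged sketch of dynamic programming for a reaching-probability
# criterion. All names and parameters (GOAL, N, STATES, CONTROLS,
# NOISE, the dynamics x + u + w) are illustrative, not from the paper.

GOAL = 3                                   # target set: {x >= GOAL}
N = 4                                      # horizon (number of steps)
STATES = range(-10, 11)                    # truncated integer state grid
CONTROLS = (-1, 0, 1)                      # admissible controls
NOISE = ((-1, 0.25), (0, 0.5), (1, 0.25))  # (value, probability) pairs

def clamp(x):
    """Keep the next state on the truncated grid."""
    return max(min(x, 10), -10)

# Terminal condition: probability 1 if already in the target set.
V = {N: {x: 1.0 if x >= GOAL else 0.0 for x in STATES}}
policy = {}

# Backward recursion: V[k][x] = max_u E[ V[k+1](x + u + w) ].
for k in range(N - 1, -1, -1):
    V[k] = {}
    for x in STATES:
        if x >= GOAL:          # absorbed: the set has been reached
            V[k][x] = 1.0
            continue
        best_u, best_p = None, -1.0
        for u in CONTROLS:
            p = sum(pw * V[k + 1][clamp(x + u + w)] for w, pw in NOISE)
            if p > best_p:
                best_u, best_p = u, p
        V[k][x] = best_p
        policy[(k, x)] = best_u

print(V[0][0])   # optimal reaching probability from x = 0  → 0.85546875
```

With `u = 1` at every step the increment `1 + w` takes values 0, 1, 2 with probabilities 1/4, 1/2, 1/4, so the 4-step displacement is Binomial(8, 1/2); the printed value equals P(Bin(8, 1/2) ≥ 3) = 219/256. The two-sided estimates discussed in the paper would bracket exactly this kind of Bellman function when the recursion cannot be solved in closed form.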