Optimal Control of a Discrete-Time Stochastic System with a Probabilistic Criterion and a Non-fixed Terminal Time

Bibliographic Details
Published in: Automation and Remote Control, Vol. 81, No. 12, pp. 2143–2159
Main Author: Azanov, V. M.
Format: Journal Article
Language: English
Published: Moscow: Pleiades Publishing / Springer Nature B.V., 01.12.2020

Summary: This paper considers an optimal control problem for a discrete-time stochastic system whose optimality criterion is the probability of first reaching the boundary of a given domain. Sufficient conditions of optimality based on dynamic programming are formulated and proved. The isobells of levels 1 and 0 of the Bellman function are used to obtain two-sided estimates of the right-hand side of the dynamic programming equation, of the Bellman function itself, and of the optimal value of the probabilistic criterion. A suboptimal control design method is proposed, conditions of equivalence to an optimal control problem with a probabilistic terminal criterion are established, and an illustrative example is given.
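
The backward dynamic-programming recursion described in the summary can be illustrated in code. The following is a minimal sketch, not the paper's construction: it assumes a scalar system x_{t+1} = x_t + u_t + w_t with Gaussian noise, a symmetric target boundary |x| >= 1, and grid, horizon, and noise parameters (N, xs, us, ws, sigma) chosen purely for illustration. Here the Bellman function B_t(x) is the maximal probability of the state first reaching the boundary by the horizon, propagated by a one-step lookahead with boundary states absorbed at value 1 (the level-1 isobell of the summary).

```python
# Minimal sketch of a dynamic-programming recursion for a probabilistic
# criterion; NOT the paper's construction. A scalar system
# x_{t+1} = x_t + u_t + w_t with Gaussian noise is assumed for illustration.
# B_t(x) approximates the maximal probability of first reaching the
# boundary |x| >= 1 by the horizon N.

import numpy as np

N = 10                                   # horizon (assumed)
xs = np.linspace(-1.5, 1.5, 301)         # state grid (assumed)
us = np.linspace(-0.2, 0.2, 21)          # admissible controls (assumed)
ws = np.linspace(-0.3, 0.3, 41)          # noise quadrature nodes
pw = np.exp(-ws**2 / (2 * 0.1**2))       # Gaussian weights, sigma = 0.1 (assumed)
pw /= pw.sum()

reached = (np.abs(xs) >= 1.0).astype(float)  # level-1 isobell: boundary reached
B = reached.copy()                           # terminal Bellman function B_N(x)

for t in range(N - 1, -1, -1):
    # One-step lookahead: B_t(x) = max_u E[ B_{t+1}(x + u + w) ],
    # with states that already reached the boundary kept absorbed at 1.
    cand = np.empty((us.size, xs.size))
    for i, u in enumerate(us):
        nxt = xs[None, :] + u + ws[:, None]   # all (noise, state) successors
        vals = np.interp(nxt, xs, B)          # interpolate B_{t+1} on the grid
        cand[i] = pw @ vals                   # expectation over the noise
    B = np.maximum(reached, cand.max(axis=0))  # absorb boundary states

print(f"P(reach boundary from x0 = 0 within {N} steps) ~ {np.interp(0.0, xs, B):.3f}")
```

Under these illustrative assumptions, the computed probability grows toward 1 as the horizon or the control authority increases; the paper's two-sided estimates built from the level-1 and level-0 isobells would bracket a value of this kind without solving the full recursion on a grid.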
ISSN: 0005-1179
eISSN: 1608-3032
DOI: 10.1134/S0005117920120012