Optimal control of stochastic differential equations with random impulses and the Hamilton–Jacobi–Bellman equation

Bibliographic Details
Published in Optimal control applications & methods, Vol. 45, No. 5, pp. 2113–2135
Main Authors Yin, Qian‐Bao; Shu, Xiao‐Bao; Guo, Yu; Wang, Zi‐Yu
Format Journal Article
Language English
Published Glasgow: Wiley Subscription Services, Inc., 01.09.2024

Summary: In this article, we study the optimal control of stochastic differential equations with random impulses. The performance index is augmented with a random compensation function that accounts for the influence of the random impulses. Using tools from stochastic analysis and the dynamic programming principle, a new Hamilton–Jacobi–Bellman (HJB) equation is derived, and the existence and uniqueness of its viscosity solution are proved. Compared with previously used performance indices, the added compensation function allows the index to be optimized in the presence of random impulses. The article also presents results of the corresponding optimal control theory.
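
For orientation, the following is a minimal sketch of the classical HJB equation for a controlled diffusion without impulses; the notation (drift b, diffusion σ, running cost f, terminal cost g, control set U) is generic and is not taken from the article, whose HJB equation carries additional terms arising from the random impulses and the compensation function in the performance index.

\[
dX_s = b(X_s,u_s)\,ds + \sigma(X_s,u_s)\,dW_s, \qquad
V(t,x) = \inf_{u}\, \mathbb{E}\Big[\int_t^T f(X_s,u_s)\,ds + g(X_T)\,\Big|\,X_t = x\Big],
\]
\[
\partial_t V(t,x) + \inf_{u\in U}\Big\{ b(x,u)\cdot\nabla_x V(t,x) + \tfrac12\,\mathrm{tr}\big(\sigma\sigma^{\top}(x,u)\,\nabla_x^2 V(t,x)\big) + f(x,u) \Big\} = 0, \qquad V(T,x) = g(x).
\]

In general the value function V is only a viscosity solution of this equation rather than a classical one, which is why existence and uniqueness results of the type proved in the article are needed.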
ISSN: 0143-2087, 1099-1514
DOI: 10.1002/oca.3139