Extended mean-field control problem with partial observation
| Published in | ESAIM: Control, Optimisation and Calculus of Variations, Vol. 28; p. 17 |
|---|---|
| Main Authors | , |
| Format | Journal Article |
| Language | English |
| Published | Les Ulis: EDP Sciences, 2022 |
| Summary | We study an extended mean-field control problem with partial observation, where the dynamics of the state are given by a forward-backward stochastic differential equation of McKean-Vlasov type. The cost functional, the state and the observation all depend on the joint distribution of the state and the control process. Our problem is motivated by the recently popular subject of mean-field games and related control problems of McKean-Vlasov type. We first establish a necessary condition for optimality in the form of Pontryagin's maximum principle. A verification theorem is then obtained for the optimal control under convexity conditions on the Hamiltonian function. The results are also applied to a linear-quadratic mean-field control problem with scalar interaction. |
| ISSN | 1292-8119, 1262-3377 |
| DOI | 10.1051/cocv/2022010 |
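
The summary describes state dynamics driven by a controlled McKean-Vlasov forward-backward SDE, a separate observation process, and a cost depending on the joint law of the state and the control. As a rough, illustrative sketch of such a setup (all notation below is assumed for exposition and is not taken from the paper):

$$
\begin{aligned}
dX_t &= b\bigl(t, X_t, Y_t, u_t, \mathcal{L}(X_t, u_t)\bigr)\,dt + \sigma\bigl(t, X_t, Y_t, u_t, \mathcal{L}(X_t, u_t)\bigr)\,dW_t, \\
dY_t &= -g\bigl(t, X_t, Y_t, Z_t, u_t, \mathcal{L}(X_t, u_t)\bigr)\,dt + Z_t\,dW_t, \\
X_0 &= x_0, \qquad Y_T = \Phi\bigl(X_T, \mathcal{L}(X_T)\bigr),
\end{aligned}
$$

with an observation process

$$
dS_t = h\bigl(t, X_t, \mathcal{L}(X_t, u_t)\bigr)\,dt + dB_t,
$$

and a cost functional of the form

$$
J(u) = \mathbb{E}\!\left[\int_0^T f\bigl(t, X_t, Y_t, u_t, \mathcal{L}(X_t, u_t)\bigr)\,dt + \Gamma\bigl(X_T, \mathcal{L}(X_T)\bigr) + \gamma(Y_0)\right],
$$

where $\mathcal{L}(X_t, u_t)$ denotes the joint law of the state and the control, $W$ and $B$ are Brownian motions, and admissible controls $u$ are adapted to the filtration generated by the observation $S$. The coefficient names $b, \sigma, g, h, f, \Phi, \Gamma, \gamma$ are placeholders; the paper's exact formulation, partial-observation structure and assumptions may differ.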