Extended mean-field control problem with partial observation

Bibliographic Details
Published in: ESAIM: Control, Optimisation and Calculus of Variations, Vol. 28, p. 17
Main Authors: Nie, Tianyang; Yan, Ke
Format: Journal Article
Language: English
Published: Les Ulis: EDP Sciences, 2022
Summary: We study an extended mean-field control problem with partial observation, where the dynamics of the state are given by a forward-backward stochastic differential equation of McKean-Vlasov type. The cost functional, the state and the observation all depend on the joint distribution of the state and the control process. Our problem is motivated by the recently popular subject of mean-field games and related control problems of McKean-Vlasov type. We first establish a necessary condition for optimality in the form of Pontryagin's maximum principle. Then a verification theorem for optimal controls is obtained under certain convexity conditions on the Hamiltonian function. The results are also applied to study a linear-quadratic mean-field control problem of scalar-interaction type.
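To fix ideas, a controlled McKean-Vlasov forward-backward system of the kind the summary describes can be sketched as follows. This is a generic illustrative formulation, not necessarily the authors' exact system: the coefficients $b$, $\sigma$, $f$, $g$, the running cost $l$, and the terminal costs $\Phi$, $\gamma$ are placeholder names, and $\mathbb{P}_{(X_t,u_t)}$ denotes the joint law of state and control.

```latex
\begin{aligned}
dX_t &= b\bigl(t, X_t, Y_t, Z_t, \mathbb{P}_{(X_t,u_t)}, u_t\bigr)\,dt
      + \sigma\bigl(t, X_t, Y_t, Z_t, \mathbb{P}_{(X_t,u_t)}, u_t\bigr)\,dW_t, \\
dY_t &= -f\bigl(t, X_t, Y_t, Z_t, \mathbb{P}_{(X_t,u_t)}, u_t\bigr)\,dt
      + Z_t\,dW_t, \\
X_0 &= x_0, \qquad Y_T = g\bigl(X_T, \mathbb{P}_{X_T}\bigr),
\end{aligned}
```

with a cost functional of the form

```latex
J(u) = \mathbb{E}\!\left[\int_0^T l\bigl(t, X_t, Y_t, Z_t,
       \mathbb{P}_{(X_t,u_t)}, u_t\bigr)\,dt
       + \Phi\bigl(X_T, \mathbb{P}_{X_T}\bigr) + \gamma(Y_0)\right],
```

where the dependence of the coefficients and the cost on the joint law $\mathbb{P}_{(X_t,u_t)}$ (rather than on the law of $X_t$ alone) is what makes the problem "extended"; under partial observation, the admissible controls are further restricted to be adapted to the filtration generated by an observation process.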
ISSN: 1292-8119
ISSN: 1262-3377
DOI: 10.1051/cocv/2022010