Optimal control of systems with noisy memory and BSDEs with Malliavin derivatives

Bibliographic Details
Published in: arXiv.org
Main Authors: Dahl, Kristina R., Mohammed, Salah-Eldin A., Øksendal, Bernt, Røse, Elin
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 27.08.2015

Summary: In this article we consider a stochastic optimal control problem where the dynamics of the state process, \(X(t)\), is a controlled stochastic differential equation with jumps, delay and \emph{noisy memory}. The term noisy memory is, to the best of our knowledge, new. By this we mean that the dynamics of \(X(t)\) depend on \(\int_{t-\delta}^t X(s) dB(s)\) (where \(B(t)\) is a Brownian motion). Hence, the dependence is noisy because of the Brownian motion, and it involves memory due to the influence from the previous values of the state process. We derive necessary and sufficient maximum principles for this stochastic control problem in two different ways, resulting in two sets of maximum principles. The first set of maximum principles is derived using Malliavin calculus techniques, while the second set comes from reduction to a discrete delay optimal control problem, and application of previously known results by Øksendal, Sulem and Zhang. The maximum principles also apply to the case where the controller only has partial information, in the sense that the admissible controls are adapted to a sub-\(\sigma\)-algebra of the natural filtration.
ISSN: 2331-8422
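
As an illustration of the noisy-memory term in the summary above, the following is a minimal Euler-Maruyama sketch of a one-dimensional state process whose drift depends on a discretization of \(\int_{t-\delta}^t X(s) dB(s)\). The coefficients a and b, the constant control u, and the omission of the jump term are illustrative assumptions, not the model analyzed in the paper.

import numpy as np

# Minimal Euler-Maruyama sketch of a state process with "noisy memory":
# the drift depends on Z(t) ~ int_{t-delta}^{t} X(s) dB(s).
# Coefficients, control and the absence of jumps are illustrative assumptions.

rng = np.random.default_rng(0)

T, N = 1.0, 1000          # time horizon and number of steps
dt = T / N
delta = 0.1               # memory length
lag = int(delta / dt)     # number of steps in the memory window

a, b, u = -0.5, 0.3, 0.1  # illustrative drift/diffusion coefficients and constant control

X = np.zeros(N + 1)
X[0] = 1.0
dB = rng.normal(0.0, np.sqrt(dt), N)   # Brownian increments
increments = np.zeros(N)               # stores X(t_k) dB(t_k) for the memory sum

for k in range(N):
    # Noisy memory term: sliding-window sum approximating int_{t-delta}^{t} X(s) dB(s)
    Z = increments[max(0, k - lag):k].sum()
    # Euler-Maruyama step for dX = (a*X + u*Z) dt + b*X dB
    X[k + 1] = X[k] + (a * X[k] + u * Z) * dt + b * X[k] * dB[k]
    increments[k] = X[k] * dB[k]

print(X[-1])

The design choice here is to store the increments X(t_k) dB(t_k) as they are generated, so the noisy-memory functional over the window [t - delta, t] can be approximated by a simple sliding-window sum rather than re-simulating the stochastic integral at every step.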