Adaptive Gaussian inverse regression with partially unknown operator

Bibliographic Details
Published in: arXiv.org
Main Authors: Johannes, Jan; Schwarz, Maik
Format: Paper; Journal Article
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 05.04.2012

Summary: This work deals with the ill-posed inverse problem of reconstructing a function \(f\) given implicitly as the solution of \(g = Af\), where \(A\) is a compact linear operator with unknown singular values and known eigenfunctions. We observe the function \(g\) and the singular values of the operator subject to Gaussian white noise with respective noise levels \(\varepsilon\) and \(\sigma\). We develop a minimax theory in terms of both noise levels and propose an orthogonal series estimator attaining the minimax rates. This estimator requires the optimal choice of a dimension parameter depending on certain characteristics of \(f\) and \(A\). This work addresses the fully data-driven choice of the dimension parameter, combining model selection with Lepski's method. We show that the fully data-driven estimator preserves minimax optimality over a wide range of classes for \(f\) and \(A\) and noise levels \(\varepsilon\) and \(\sigma\). The results are illustrated considering Sobolev spaces and mildly and severely ill-posed inverse problems.
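The orthogonal series approach summarised above can be sketched numerically. The following is a minimal illustrative sketch, not the paper's estimator: it assumes a mildly ill-posed problem with hypothetical singular values \(\lambda_k = k^{-1}\), Sobolev-type coefficients for \(f\), noisy observations of both the coefficients of \(g\) and the singular values, and a simple penalised-contrast rule standing in for the model-selection/Lepski combination that chooses the dimension parameter \(m\).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mildly ill-posed setup (illustration only):
# f has Sobolev-type coefficients, A has singular values lambda_k = 1/k.
K = 200
k = np.arange(1, K + 1)
f_coef = k**-1.5                       # "true" coefficients of f
lam = k**-1.0                          # "true" singular values of A
g_coef = lam * f_coef                  # coefficients of g = A f

# Both g and the singular values are observed under Gaussian noise,
# with respective noise levels eps and sigma.
eps, sigma = 1e-3, 1e-3
g_obs = g_obs_coef = g_coef + eps * rng.standard_normal(K)
lam_obs = lam + sigma * rng.standard_normal(K)

def series_estimate(m):
    """Spectral cut-off estimator: invert the first m observed singular values."""
    c = np.zeros(K)
    c[:m] = g_obs[:m] / lam_obs[:m]
    return c

def choose_m(max_m=100):
    """Naive data-driven dimension choice: minimise a penalised contrast
    (residual fit plus a variance-type penalty growing with the inversion)."""
    best_m, best_crit = 1, np.inf
    for m in range(1, max_m + 1):
        pen = 2 * eps**2 * np.sum(1.0 / lam_obs[:m]**2)
        crit = -np.sum((g_obs[:m] / lam_obs[:m])**2) + pen
        if crit < best_crit:
            best_m, best_crit = m, crit
    return best_m

m_hat = choose_m()
f_hat = series_estimate(m_hat)
err = np.linalg.norm(f_hat - f_coef)
```

The penalty balances the squared bias of truncating at \(m\) against the variance \(\varepsilon^2 \sum_{k \le m} \lambda_k^{-2}\) inflated by the inversion; the chosen \(\hat m\) and error depend on both noise levels, mirroring the two-noise-level minimax theory of the abstract.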
Bibliography: Working Paper / Pre-Print
ISSN: 2331-8422
DOI: 10.48550/arxiv.1204.1226