Adaptive Gaussian Inverse Regression with Partially Unknown Operator
Published in: Communications in Statistics - Theory and Methods, Vol. 42, No. 7, pp. 1343-1362
Main Authors:
Format: Journal Article
Language: English
Published: Philadelphia: Taylor & Francis Group / Taylor & Francis Ltd, 01.04.2013
Summary: This work deals with the ill-posed inverse problem of reconstructing a function f given implicitly as the solution of g = Af, where A is a compact linear operator with unknown singular values and known eigenfunctions. We observe the function g and the singular values of the operator subject to Gaussian white noise with respective noise levels ϵ and σ. We develop a minimax theory in terms of both noise levels and propose an orthogonal series estimator attaining the minimax rates. This estimator requires the optimal choice of a dimension parameter depending on certain characteristics of f and A. This work addresses the fully data-driven choice of the dimension parameter combining model selection with Lepski's method. We show that the fully data-driven estimator preserves minimax optimality over a wide range of classes for f and A and noise levels ϵ and σ. The results are illustrated considering Sobolev spaces and mildly and severely ill-posed inverse problems.
ISSN: 0361-0926, 1532-415X
DOI: 10.1080/03610926.2012.731548
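
The summary describes an orthogonal series estimator that divides noisy coefficients of g by noisy estimates of the singular values and truncates the series at a data-driven dimension. Below is a minimal Python sketch of that general idea, not the authors' procedure: the simulation setup, decay rates, threshold tau, and penalty constant c_pen are illustrative assumptions, and the simple penalized-contrast rule only stands in for the paper's combination of model selection with Lepski's method.

```python
# Minimal illustrative sketch (assumptions noted above): an orthogonal series
# estimator for the inverse problem g = A f, written in the eigenbasis of A.
# Both the coefficients of g and the singular values of A are observed with
# Gaussian noise (levels eps and sigma).  The dimension parameter is chosen by
# a simple penalized contrast, a stand-in for the paper's adaptive rule.

import numpy as np

rng = np.random.default_rng(0)

# --- simulated mildly ill-posed setup (all choices below are illustrative) ---
J = 500                                   # number of basis coefficients kept
j = np.arange(1, J + 1)
f_coef = j ** (-1.5) * np.cos(j)          # "true" Sobolev-type coefficients of f
lam = j ** (-1.0)                         # unknown singular values, polynomial decay
eps, sigma = 1e-3, 1e-3                   # noise levels for g and for the operator

g_hat = lam * f_coef + eps * rng.standard_normal(J)   # noisy coefficients of g
lam_hat = lam + sigma * rng.standard_normal(J)        # noisy singular values


def series_estimator(k, tau=2.0):
    """Orthogonal series estimator of dimension k: divide the observed
    coefficients of g by the estimated singular values, but only where the
    estimated singular value is safely above the operator noise level."""
    est = np.zeros(J)
    keep = np.abs(lam_hat[:k]) >= tau * sigma
    est[:k][keep] = g_hat[:k][keep] / lam_hat[:k][keep]
    return est


def select_dimension(k_max=200, c_pen=2.0):
    """Pick k by minimizing -||f_hat_k||^2 plus a variance-type penalty
    (an illustrative stand-in for the paper's data-driven criterion)."""
    best_k, best_crit = 1, np.inf
    for k in range(1, k_max + 1):
        est = series_estimator(k)
        penalty = c_pen * eps ** 2 * np.sum(
            1.0 / np.maximum(lam_hat[:k] ** 2, sigma ** 2)
        )
        crit = -np.sum(est ** 2) + penalty
        if crit < best_crit:
            best_k, best_crit = k, crit
    return best_k


k_star = select_dimension()
f_est = series_estimator(k_star)
print(f"selected dimension k = {k_star}, "
      f"squared coefficient error = {np.sum((f_est - f_coef) ** 2):.3e}")
```

The threshold against sigma in the sketch is meant to reflect the role of the operator noise level in the abstract: coefficients whose estimated singular value is of the same order as σ are dropped rather than divided by an essentially random quantity, which is one way the second noise level enters the attainable rates.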