Wavelet frame based blind image inpainting

Bibliographic Details
Published in: Applied and Computational Harmonic Analysis, Vol. 32, No. 2, pp. 268-279
Main Authors: Dong, Bin; Ji, Hui; Li, Jia; Shen, Zuowei; Xu, Yuhong
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.03.2012
ISSN: 1063-5203, 1096-603X
DOI: 10.1016/j.acha.2011.06.001

Summary: Image inpainting has been widely used in practice to repair damaged/missing pixels of given images. Most existing inpainting techniques require knowing beforehand where those damaged pixels are, either given a priori or detected by some pre-processing. However, in certain applications, such information is neither available nor can it be reliably pre-detected, e.g. removing random-valued impulse noise from images or removing certain scratches from archived photographs. This paper introduces a blind inpainting model to solve this type of problem, i.e., a model that simultaneously identifies and recovers damaged pixels of the given image. A tight frame based regularization approach is developed in this paper for such blind inpainting problems, and the resulting minimization problem is solved by the split Bregman algorithm first proposed by Goldstein and Osher (2009) [1]. The proposed blind inpainting method is applied to various challenging image restoration tasks, including recovering images that are blurry and damaged by scratches, and removing image noise mixed with both Gaussian and random-valued impulse noise. The experiments show that our method compares favorably with many available two-stage methods in these applications.
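The abstract names the split Bregman algorithm of Goldstein and Osher as the solver for the tight-frame-regularized minimization. The sketch below shows the generic split Bregman iteration on the simplest possible instance, an l1-regularized least-squares problem with identity operators; it is an illustration only, not the paper's actual blind inpainting model, which would replace the identity operators with a blur/mask operator and a tight wavelet frame transform. All variable names and parameter values here are illustrative assumptions.

```python
import numpy as np

def shrink(x, t):
    """Soft-thresholding (shrinkage) operator: the proximal map of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def split_bregman_l1(f, mu=1.0, lam=1.0, iters=500):
    """Split Bregman iteration for  min_u ||u||_1 + (mu/2)||u - f||^2.

    The constraint d = u is enforced via the Bregman variable b.  In the
    paper's setting the quadratic term involves a blur/mask operator and the
    l1 term is applied to wavelet frame coefficients; here both operators
    are the identity so every subproblem has a closed form.
    """
    u = np.zeros_like(f)
    d = np.zeros_like(f)   # splitting variable, d ~ u
    b = np.zeros_like(f)   # Bregman (dual) variable
    for _ in range(iters):
        # u-subproblem: quadratic, closed-form solution in this identity case
        u = (mu * f + lam * (d - b)) / (mu + lam)
        # d-subproblem: soft thresholding
        d = shrink(u + b, 1.0 / lam)
        # Bregman update, accumulating the constraint residual
        b = b + u - d
    return u

f = np.array([3.0, 0.5, -2.0])
u = split_bregman_l1(f, mu=1.0)
# For this toy problem the exact minimizer is soft-thresholding of f
# at threshold 1/mu = 1, i.e. shrink(f, 1).
print(u)
```

With both subproblems solved exactly, the iteration for each component converges geometrically to the soft-thresholded value, which is why a fixed iteration count suffices in this toy setting.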