Generation of All-in-Focus Images by Noise-Robust Selective Fusion of Limited Depth-of-Field Images

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 22, No. 3, pp. 1242–1251
Main Authors: Pertuz, S., Puig, D., Garcia, M. A., Fusiello, A.
Format: Journal Article
Language: English
Published: New York, NY: IEEE, 01.03.2013

Summary: The limited depth of field of some cameras prevents them from capturing perfectly focused images when the imaged scene covers a large distance range. To compensate for this limitation, image fusion has been exploited to combine images captured with different camera settings, yielding a higher-quality all-in-focus image. Since most current approaches to image fusion rely on maximizing the spatial frequency of the composed image, the fusion process is sensitive to noise. In this paper, a new algorithm for computing the all-in-focus image from a sequence of images captured with a low depth-of-field camera is presented. The proposed approach adaptively fuses the different frames of the focus sequence in order to reduce noise while preserving image features. The algorithm consists of three stages: 1) focus measure; 2) selectivity measure; and 3) image fusion. An extensive set of experimental tests has been carried out to compare the proposed algorithm with state-of-the-art all-in-focus methods on both synthetic and real sequences. The obtained results show the advantages of the proposed scheme even for high levels of noise.
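The three stages named in the summary (focus measure, selectivity measure, image fusion) suggest a focus-measure-driven blend rather than a hard per-pixel frame selection. The Python sketch below illustrates that general idea only; the local absolute-Laplacian focus measure, the "selectivity" exponent used as a soft selection control, and the function name all_in_focus are illustrative assumptions, not the algorithm published in the paper.

    import numpy as np
    from scipy import ndimage

    def all_in_focus(frames, selectivity=4.0, window=9):
        """Fuse a focus sequence into a single all-in-focus image.

        Minimal sketch of focus-measure-based fusion (assumed, not the
        authors' method): per-pixel focus is estimated with a locally
        averaged absolute Laplacian, and frames are blended with weights
        proportional to focus**selectivity.
        """
        frames = np.asarray(frames, dtype=np.float64)      # shape (K, H, W)
        focus = np.empty_like(frames)
        for k, f in enumerate(frames):
            lap = np.abs(ndimage.laplace(f))               # |Laplacian| responds to sharp detail
            focus[k] = ndimage.uniform_filter(lap, window) # local focus measure
        weights = focus ** selectivity                     # soft selectivity control
        weights /= weights.sum(axis=0) + 1e-12             # normalize over the sequence
        return (weights * frames).sum(axis=0)

A high selectivity exponent approaches winner-take-all selection of the sharpest frame per pixel, while lower values average more frames and suppress noise, which mirrors the noise-robustness trade-off the summary attributes to frequency-maximizing fusion schemes.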
ISSN: 1057-7149
EISSN: 1941-0042
DOI: 10.1109/TIP.2012.2231087