Multispectral image fusion for illumination-invariant palmprint recognition

Bibliographic Details
Published in: PLoS ONE, Vol. 12, No. 5, p. e0178432
Main Authors: Lu, Longbin; Zhang, Xinman; Xu, Xuebin; Shang, Dongpeng
Format: Journal Article
Language: English
Published: Public Library of Science (PLoS), United States, 30.05.2017

Summary: Multispectral palmprint recognition has shown broad prospects for personal identification due to its high accuracy and great stability. In this paper, we develop a novel illumination-invariant multispectral palmprint recognition method. To combine the information from multiple spectral bands, an image-level fusion framework is built on a fast and adaptive bidimensional empirical mode decomposition (FABEMD) and a weighted Fisher criterion. The FABEMD technique decomposes the multispectral images into their bidimensional intrinsic mode functions (BIMFs), on which an illumination compensation operation is performed. The weighted Fisher criterion is used to construct the fusion coefficients at the decomposition level so that the images are correctly separated in the fusion space. The image fusion framework shows strong robustness against illumination variation. In addition, a tensor-based extreme learning machine (TELM) mechanism is presented for feature extraction and classification of two-dimensional (2D) images. This method offers fast learning speed and satisfactory recognition accuracy. Comprehensive experiments conducted on the PolyU multispectral palmprint database show that the proposed method achieves favorable results. For testing under ideal illumination, the recognition accuracy is as high as 99.93%, and the result is 99.50% under unsatisfactory lighting conditions.
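The summary above describes a two-stage pipeline: image-level fusion of the spectral bands (FABEMD decomposition into BIMFs, illumination compensation, then a weighted combination whose coefficients come from a weighted Fisher criterion), followed by classification with a tensor-based extreme learning machine. As a rough illustration of these two ideas only (not the authors' implementation): the Python sketch below assumes the per-band decomposition levels and the per-level fusion weights are already given, and it simplifies the tensor-based ELM to a standard extreme learning machine operating on vectorized images; the names fuse_levels and SimpleELM are hypothetical.

import numpy as np

def fuse_levels(decompositions, weights):
    # decompositions: list over spectral bands; each entry is an array of shape
    #   (n_levels, H, W) holding that band's decomposition levels (the BIMFs plus
    #   residue in the paper's setting, after illumination compensation).
    # weights: array of shape (n_bands, n_levels) of fusion coefficients
    #   (in the paper these are derived from a weighted Fisher criterion).
    fused = np.zeros_like(decompositions[0][0], dtype=float)
    for b, levels in enumerate(decompositions):
        for k, level in enumerate(levels):
            fused += weights[b, k] * level
    return fused

class SimpleELM:
    # Standard single-hidden-layer extreme learning machine: random hidden
    # weights, closed-form (ridge-regularized) output weights.
    def __init__(self, n_hidden=500, reg=1e-3, seed=0):
        self.n_hidden = n_hidden
        self.reg = reg
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)  # random nonlinear feature map

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        T = np.eye(n_classes)[y]  # one-hot targets
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden)) * 0.1
        self.b = self.rng.standard_normal(self.n_hidden) * 0.1
        H = self._hidden(X)
        # Output weights by regularized least squares; no iterative training.
        self.beta = np.linalg.solve(H.T @ H + self.reg * np.eye(self.n_hidden), H.T @ T)
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)

Usage would be along the lines of: fuse each sample's band-wise decompositions into a single palm image, vectorize the fused images into rows of X with integer class labels y, then call SimpleELM().fit(X_train, y_train).predict(X_test). The closed-form solution for the output weights is what gives the ELM family the fast learning speed mentioned in the summary.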
Competing Interests: The authors have declared that no competing interests exist.
Conceptualization: LBL, XBX. Data curation: XMZ. Formal analysis: XMZ, DPS. Funding acquisition: XMZ. Investigation: LBL, DPS. Methodology: LBL, XBX. Project administration: XMZ, LBL. Resources: XBX. Software: LBL. Supervision: XMZ. Validation: XBX. Visualization: LBL, DPS. Writing – original draft: LBL, XMZ. Writing – review & editing: LBL, XMZ.
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0178432