Represent and fuse bimodal biometric images at the feature level: complex-matrix-based fusion scheme

Bibliographic Details
Published in: Optical Engineering, Vol. 49, No. 3, p. 037002
Main Authors: Xu, Yong; Zhang, David
Format: Journal Article
Language: English
Published: 01.03.2010

Summary: Multibiometrics can achieve higher accuracy than single biometrics by simultaneously using multiple biometric traits of a subject. Because biometric traits are usually captured as images, how to properly fuse the information of multiple biometric images of a subject for authentication is crucial for multibiometrics. We propose a novel image-based linear discriminant analysis (IBLDA) approach to fuse two biometric traits (i.e., bimodal biometric images) of the same subject in matrix form at the feature level. IBLDA first integrates the two biometric traits of one subject into a complex matrix and then directly extracts low-dimensional features from the integrated biometric traits. IBLDA also exploits more information than matching-score-level and decision-level fusion. Compared to linear discriminant analysis (LDA), IBLDA has the following advantages: first, it can overcome the small-sample-size problem from which conventional LDA usually suffers; second, it solves the eigenequation at a low computational cost; third, storing the scatter matrices imposes a much lighter memory burden than in conventional LDA. We also present the theoretical foundation of the proposed method. Experimental results show that the proposed method achieves high classification accuracy.
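
The abstract describes IBLDA only at a high level: two registered biometric images of one subject are combined into a single complex matrix, and discriminant features are then extracted directly from that matrix. The NumPy sketch below illustrates that general idea under stated assumptions; the function names, the choice of column-direction (n x n) scatter matrices, and the use of a pseudo-inverse of the within-class scatter are illustrative, not the paper's exact formulation.

import numpy as np

def iblda_fit(images_a, images_b, labels, n_components=5):
    """Learn a complex projection from fused bimodal image pairs (illustrative sketch).

    images_a, images_b : sequences of m x n real arrays, the two biometric traits
                         of each sample, registered to a common size
    labels             : class (subject) label per sample
    Returns an n x n_components complex projection matrix W.
    """
    # Step 1: fuse the two traits of each sample into one complex matrix Z = A + iB.
    Z = np.array([a + 1j * b for a, b in zip(images_a, images_b)])
    labels = np.asarray(labels)
    overall_mean = Z.mean(axis=0)

    n = Z.shape[2]
    S_w = np.zeros((n, n), dtype=complex)   # within-class scatter, n x n
    S_b = np.zeros((n, n), dtype=complex)   # between-class scatter, n x n
    for c in np.unique(labels):
        Zc = Z[labels == c]
        class_mean = Zc.mean(axis=0)
        for Zi in Zc:
            D = Zi - class_mean
            S_w += D.conj().T @ D
        Dm = class_mean - overall_mean
        S_b += len(Zc) * (Dm.conj().T @ Dm)

    # Step 2: solve the small n x n eigenproblem and keep the leading directions.
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_w) @ S_b)
    order = np.argsort(-eigvals.real)
    return eigvecs[:, order[:n_components]]

def iblda_features(image_a, image_b, W):
    # Project a fused sample onto the learned directions: an m x n_components complex feature matrix.
    return (image_a + 1j * image_b) @ W

Because the scatter matrices in this sketch are only n x n (the image width) rather than the size of a vectorized image, the eigenproblem stays small even with few training samples per subject, which matches the intuition behind the small-sample-size, computational-cost, and memory claims in the abstract.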
ISSN: 0091-3286, 1560-2303
DOI: 10.1117/1.3359514