Gender classification using dorsal NIR hand veins imaging
Published in | Neural Computing & Applications, Vol. 37, No. 22, pp. 18275–18301 |
---|---|
Main Authors | , , , , |
Format | Journal Article |
Language | English |
Published | London: Springer London, 01.08.2025 (Springer Nature B.V.) |
Subjects | |
Summary: | Gender classification is a soft biometric trait that can be used for identification, surveillance, database indexing, social/medical services, forensic anthropology, and e-marketing. It has been addressed with the face, iris, fingerprints, finger/palm hand veins, and hand geometry. The hand offers several biometric modalities, such as dorsal and palmar hand veins, finger veins, hand geometry, and fingerprints. Near-infrared dorsal hand veins have not previously been addressed for gender classification. They are hidden under the skin and, unlike the face and fingerprints, very hard to fake, and they can be imaged with inexpensive cameras. In this paper, we acquired a novel multimodal dataset of 200 subjects for the purpose of gender classification. The acquired near-infrared image modalities include dorsal hand veins in fisted and landed-with-wrist positions, as well as hand geometry. At least 5 images per hand were acquired over several sessions. We also acquired seven facial expressions and side-view color images. Image enhancement, region-of-interest extraction, texture mapping, and data augmentation methods are proposed. Sixty-three individual convolutional neural network models were proposed, trained, validated, and tested to classify gender from twenty-one hand-vein image modalities and three split ratios. The highest testing accuracy across all individual models is 96.97%, with a fivefold cross-validation mean of 95.33%. Higher accuracies were found for females (98.73%) and for left hands (97.35%). The individual models' classifier probability scores were fused, and fusion accuracies exceed 99% for some fusion rules. These accuracies highlight the potential of the proposed gender classification system for biometric applications, particularly where gender classification accuracy is crucial. |
---|---|
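The summary mentions fusing individual models' classifier probability scores under several fusion rules. A minimal sketch of score-level fusion, assuming common sum, product, and max rules (the paper's exact rules and thresholds are not specified here; the model scores below are purely illustrative):

```python
def fuse_scores(scores, rule="sum"):
    """Combine per-model probability scores (e.g. each CNN's P(female))
    into a single fused score using a simple fusion rule."""
    if rule == "sum":       # sum rule: average of the scores
        return sum(scores) / len(scores)
    if rule == "product":   # product rule: multiply the scores
        p = 1.0
        for s in scores:
            p *= s
        return p
    if rule == "max":       # max rule: most confident model wins
        return max(scores)
    raise ValueError(f"unknown fusion rule: {rule}")

# Illustrative example: three models' scores for one subject
scores = [0.91, 0.88, 0.95]
fused = fuse_scores(scores, rule="sum")
label = "female" if fused >= 0.5 else "male"
```

Score-level fusion of this kind lets complementary modalities (e.g. left/right hand, fisted/landed positions) correct each other's errors, which is consistent with the fused accuracy exceeding any single model's.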
Bibliography: | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 |
ISSN: | 0941-0643 1433-3058 |
DOI: | 10.1007/s00521-025-11363-7 |