An Advanced CNN Based Iris Recognition and Segmentation for Visible Spectrum Images
Published in | 2022 International Conference on Advancement in Electrical and Electronic Engineering (ICAEEE) pp. 1 - 5 |
---|---|
Main Authors | , , |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 24.02.2022 |
Summary: | Iris recognition is a biometric identification technique; the iris's intricate patterns are distinctive and durable, and it is considered today's most reliable biometric modality. Various features and strategies for iris recognition have been proposed over the years. The aim here is to develop a durable, viable, low-cost system for recognizing iris images captured in the visible spectrum, since most available solutions rely on Near-Infrared (NIR) cameras to capture the iris. Erroneous segmentation of the iris can destabilize the entire recognition pipeline; therefore, a novel technique is presented that segments the true iris patch using the Circular Hough Transform (CHT) and Canny edge detection. For recognition, a Convolutional Neural Network (CNN) model learns features through backpropagation, built from 2D convolution layers, max-pooling layers, a dropout layer, and a dense layer. The study focuses on training a CNN model to classify every individual in the dataset. In session 1 and session 2, the proposed system achieved promising test accuracies of 95.20% and 99.28%, respectively, surpassing several leading techniques. The framework was evaluated primarily on the public UBIRIS v1 and IITD Iris datasets. |
---|---|
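The circle-localization step of the segmentation pipeline described in the summary can be sketched as follows. This is an illustrative NumPy implementation of the Circular Hough Transform voting scheme applied to a binary edge map (such as the output of a Canny detector), not the authors' code; the candidate radii, angular resolution, and synthetic test circle are assumptions for demonstration only.

```python
import numpy as np

def circular_hough_transform(edges, radii, n_angles=100):
    """Vote for circle centres over a set of candidate radii.

    edges : 2-D boolean edge map (e.g. from a Canny edge detector)
    radii : iterable of candidate circle radii in pixels
    Returns (cx, cy, r) of the highest-voted circle.
    """
    h, w = edges.shape
    ys, xs = np.nonzero(edges)
    thetas = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    best = (0, 0, 0, -1)  # cx, cy, r, votes
    for r in radii:
        acc = np.zeros((h, w), dtype=np.int32)
        # each edge pixel votes for every centre lying at distance r from it
        for t in thetas:
            cx = np.round(xs - r * np.cos(t)).astype(int)
            cy = np.round(ys - r * np.sin(t)).astype(int)
            ok = (cx >= 0) & (cx < w) & (cy >= 0) & (cy < h)
            np.add.at(acc, (cy[ok], cx[ok]), 1)
        peak = np.unravel_index(acc.argmax(), acc.shape)
        if acc[peak] > best[3]:
            best = (peak[1], peak[0], r, acc[peak])
    return best[:3]

# Synthetic edge map: a circle of radius 20 centred at (40, 40).
img = np.zeros((80, 80), dtype=bool)
tt = np.linspace(0, 2 * np.pi, 360)
img[np.round(40 + 20 * np.sin(tt)).astype(int),
    np.round(40 + 20 * np.cos(tt)).astype(int)] = True

cx, cy, r = circular_hough_transform(img, radii=[18, 20, 22])
# the best-voted circle should be near (40, 40) with r == 20
```

In the paper's setting, the same voting procedure would be run on the Canny edge map of the eye image, once over a radius range for the pupil boundary and once for the limbic (iris/sclera) boundary.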
DOI: | 10.1109/ICAEEE54957.2022.9836333 |