A Hybrid Rule-Based and Machine Learning System for Arabic Check Courtesy Amount Recognition

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 23, No. 9, p. 4260
Main Author: Ahmad, Irfan
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 25.04.2023
Summary: Courtesy amount recognition from bank checks is an important application of pattern recognition. Although much progress has been made on isolated digit recognition for Indian digits, no work has been reported in the literature on courtesy amount recognition for Arabic checks using Indian digits. Arabic check courtesy amount recognition comes with unique challenges that are not seen in isolated digit recognition tasks and, accordingly, requires specific approaches to deal with them. This paper presents an end-to-end system for courtesy amount recognition, starting from check images as input and producing the recognized amounts as sequences of digits. The system is a hybrid, combining rule-based and machine learning modules. For amount recognition, both segmentation-based and segmentation-free approaches were investigated and compared. We evaluated our system on the CENPARMI dataset of real Arabic bank checks, achieving 67.4% accuracy at the amount level and 87.15% accuracy at the digit level on a test set of 626 check images. The results are presented with detailed analysis, and possible future work is identified. This work can serve as a baseline for benchmarking future research in Arabic check courtesy amount recognition.
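
The record does not detail the paper's segmentation-free approach; the sketch below shows one common way such a recognizer can be set up, assuming a CNN feature extractor feeding a bidirectional LSTM trained with CTC loss in PyTorch. The class layout, network sizes, and input dimensions are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation) of a segmentation-free
# digit-sequence recognizer for courtesy amount images: a CNN feature
# extractor, a bidirectional LSTM, and a CTC output layer.
# Class 0 is reserved for the CTC blank; classes 1-10 stand for the ten
# Indian (Eastern Arabic) digits -- label conventions assumed for illustration.
import torch
import torch.nn as nn

class CourtesyAmountRecognizer(nn.Module):
    def __init__(self, num_classes=11):  # 10 digits + CTC blank
        super().__init__()
        # Convolutional features over the grayscale amount-field strip
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Recurrent layer reads the feature map left to right as a sequence
        self.rnn = nn.LSTM(input_size=64 * 8, hidden_size=128,
                           bidirectional=True, batch_first=True)
        self.fc = nn.Linear(2 * 128, num_classes)

    def forward(self, x):          # x: (batch, 1, 32, width)
        f = self.cnn(x)            # (batch, 64, 8, width/4)
        b, c, h, w = f.shape
        f = f.permute(0, 3, 1, 2).reshape(b, w, c * h)  # (batch, steps, features)
        out, _ = self.rnn(f)
        return self.fc(out)        # per-timestep class scores for CTC

# Training-step sketch: CTC loss over unsegmented digit-label sequences
model = CourtesyAmountRecognizer()
ctc = nn.CTCLoss(blank=0)
images = torch.randn(4, 1, 32, 128)                      # dummy amount-field crops
targets = torch.randint(1, 11, (4, 5))                   # dummy 5-digit amounts
logits = model(images).log_softmax(2).permute(1, 0, 2)   # (steps, batch, classes)
input_lens = torch.full((4,), logits.size(0), dtype=torch.long)
target_lens = torch.full((4,), 5, dtype=torch.long)
loss = ctc(logits, targets, input_lens, target_lens)
loss.backward()
```

With CTC, the network never needs the digits to be segmented in advance; the loss marginalizes over all alignments between the per-timestep predictions and the ground-truth digit string, which is what makes the approach segmentation-free.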
ISSN: 1424-8220
DOI: 10.3390/s23094260