Robust Recognition of Chinese Text from Cellphone-acquired Low-quality Identity Card Images Using Convolutional Recurrent Neural Network


Bibliographic Details
Published in: Sensors and Materials, Vol. 33, No. 4, p. 1187
Main Authors: Wang, Jianmei; Wu, Ruize; Zhang, Shaoming
Format: Journal Article
Language: English
Published: Tokyo, MYU Scientific Publishing Division, 06.04.2021
Summary: Automatic reading of text from an identity (ID) card image has a wide range of social uses. In this paper, we propose a novel method for Chinese text recognition from ID card images taken by cellphone cameras. The paper has two main contributions: (1) A synthetic data engine based on a conditional generative adversarial network is designed to generate millions of synthetic ID card text line images, which not only retain the inherent template pattern of ID card images but also preserve the diversity of the synthetic data. (2) An improved convolutional recurrent neural network (CRNN) is presented to increase Chinese text recognition accuracy, in which a DenseNet backbone replaces the VGGNet architecture to extract more sophisticated spatial features. The proposed method is evaluated on more than 7000 real ID card text line images. The experimental results demonstrate that the improved CRNN model, trained only on the synthetic dataset, increases the recognition accuracy of Chinese text in cellphone-acquired low-quality images. Specifically, compared with the original CRNN, the average character recognition accuracy (CRA) is increased from 96.87% to 98.57% and the line recognition accuracy (LRA) is increased from 65.92% to 90.10%.
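Note: CRNN-style recognizers of the kind summarized above are typically trained with a CTC objective, and their per-frame outputs are turned into a text line by collapsing repeated labels and removing blanks. The abstract does not show this step, so the following is only a minimal illustrative sketch of greedy CTC decoding; the function name and the convention that index 0 is the blank symbol are assumptions, not details from the paper.

```python
def ctc_greedy_decode(logits, blank=0):
    """Greedy CTC decoding: take the argmax class per timestep,
    collapse consecutive repeats, then drop the blank symbol.

    logits: list of per-timestep class-score lists (shape T x C).
    Returns the decoded label-index sequence.
    """
    # Per-frame best class (argmax over the class axis).
    best = [max(range(len(frame)), key=frame.__getitem__) for frame in logits]
    decoded, prev = [], None
    for idx in best:
        # Emit only when the label changes and is not the blank.
        if idx != prev and idx != blank:
            decoded.append(idx)
        prev = idx
    return decoded


# Example: frame-wise argmaxes 1,1,0,1,2,2,0 decode to [1, 1, 2]
# (the blank between the two 1s prevents them from being merged).
frames = [
    [0.1, 0.9, 0.0],
    [0.1, 0.8, 0.1],
    [0.9, 0.05, 0.05],
    [0.2, 0.7, 0.1],
    [0.1, 0.2, 0.7],
    [0.0, 0.3, 0.7],
    [0.8, 0.1, 0.1],
]
print(ctc_greedy_decode(frames))  # → [1, 1, 2]
```

In a full pipeline the decoded indices would be mapped back to Chinese characters through the model's character table; line recognition accuracy (LRA) then counts a line as correct only if every character matches.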
ISSN: 0914-4935, 2435-0869
DOI: 10.18494/SAM.2021.2991