A Novel Deep Convolutional Neural Network Architecture Based on Transfer Learning for Handwritten Urdu Character Recognition
Published in | Tehnički vjesnik, Vol. 27, No. 4, pp. 1160-1165 |
---|---|
Main Authors | |
Format | Journal Article |
Language | English |
Published | Slavonski Brod: Josipa Jurja Strossmayer University of Osijek (Faculty of Mechanical Engineering in Slavonski Brod; Faculty of Electrical Engineering in Osijek; Faculty of Civil Engineering in Osijek), 15.08.2020 |
Summary: | Deep convolutional neural networks (CNNs) have had a huge impact on computer vision and set the state of the art in highly accurate classification. For character recognition, where training images are usually scarce, transfer learning from pre-trained CNNs is often used. In this paper, we propose a novel deep convolutional neural network for handwritten Urdu character recognition built by transfer learning from three pre-trained CNN models. We fine-tune the layers of these pre-trained CNNs to extract features that capture both the global and the local details of Urdu character structure. The features extracted by the three CNN models are concatenated and passed to two fully connected layers for classification. Experiments on the UNHD, EMILLE, DBAHCL, and CDB/Farsi datasets achieve an average recognition accuracy of 97.18%, outperforming the individual CNNs and numerous conventional classification methods. |
---|---|
Bibliography: | 242316 |
ISSN: | 1330-3651; 1848-6339 |
DOI: | 10.17559/TV-20190319095323 |
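The abstract outlines a fusion architecture: features from three fine-tuned, pre-trained CNNs are concatenated and classified by two fully connected layers. Below is a minimal PyTorch sketch of that idea, not the authors' code; the record does not name the three backbones, so VGG16, ResNet18, and DenseNet121, along with all layer sizes, are illustrative assumptions.

```python
# Sketch of the three-branch fusion architecture described in the abstract.
# Backbone choices and layer sizes are assumptions for illustration only.
import torch
import torch.nn as nn
from torchvision import models


class FusedUrduClassifier(nn.Module):
    """Concatenates features from three pre-trained CNNs, then classifies
    with two fully connected layers (per the abstract's description)."""

    def __init__(self, num_classes: int):
        super().__init__()
        # Three ImageNet-pretrained backbones, truncated to feature extractors.
        vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT)
        resnet = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        densenet = models.densenet121(weights=models.DenseNet121_Weights.DEFAULT)

        self.branch1 = nn.Sequential(vgg.features, nn.AdaptiveAvgPool2d(1))   # 512-d
        self.branch2 = nn.Sequential(*list(resnet.children())[:-1])           # 512-d
        self.branch3 = nn.Sequential(densenet.features, nn.ReLU(),
                                     nn.AdaptiveAvgPool2d(1))                 # 1024-d

        # Two fully connected layers on the concatenated feature vector.
        self.classifier = nn.Sequential(
            nn.Linear(512 + 512 + 1024, 512),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(512, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Each branch yields a pooled feature vector; fuse by concatenation.
        f1 = torch.flatten(self.branch1(x), 1)
        f2 = torch.flatten(self.branch2(x), 1)
        f3 = torch.flatten(self.branch3(x), 1)
        return self.classifier(torch.cat([f1, f2, f3], dim=1))
```

A hypothetical usage: `model = FusedUrduClassifier(num_classes=40)` applied to 224x224 RGB crops of Urdu characters, then fine-tuned end to end with a standard cross-entropy loss, mirroring the fine-tuning of the pre-trained layers the abstract mentions.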