Subspace-Based Convolutional Network for Handwritten Character Recognition
Published in: 2017 14th IAPR International Conference on Document Analysis and Recognition (ICDAR), Vol. 1, pp. 1044-1049
Format: Conference Proceeding
Language: English
Published: IEEE, 01.11.2017
Summary: In recent years, several convolutional neural network (CNN) based architectures have been proposed for handwritten character recognition. However, most conventional models demand large-scale training data and long training times to compute their parameters and achieve satisfactory results, which prevents their use in a wider range of applications. To address these problems, we present a novel convolutional network for handwritten character recognition based on the subspace method. Our approach relies on the assumption that convolutional kernels can be learned efficiently from subspaces and employed directly to produce highly discriminative features in a CNN architecture. Representing each image class by a subspace decreases inter-class similarity, since the subspaces form clusters in a low-dimensional space. To increase intra-class similarity, we estimate a discriminative space from the training subspaces. By learning convolutional kernels from subspaces, the network produces representative and discriminative information with few parameters, yielding a lightweight network. Its flexible architecture and straightforward implementation make the proposed method quite attractive in practical terms. Our experimental evaluation shows competitive results compared to state-of-the-art methods.
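The abstract describes learning convolutional kernels from subspaces rather than by backpropagation. As a minimal sketch of that general idea (not the authors' exact pipeline, which also estimates a discriminative space from the training subspaces), the code below assumes the kernels are taken as the leading principal directions of a subspace fitted to image patches; all function names, patch sizes, and the toy data are illustrative assumptions.

```python
import numpy as np

def extract_patches(images, k):
    """Collect every k x k patch from a stack of images, flattened to vectors."""
    patches = []
    for img in images:
        H, W = img.shape
        for i in range(H - k + 1):
            for j in range(W - k + 1):
                p = img[i:i + k, j:j + k].ravel()
                patches.append(p - p.mean())  # remove the patch mean
    return np.array(patches)

def subspace_kernels(images, k, n_kernels):
    """Learn convolutional kernels as the leading principal directions
    (a PCA subspace) of the patch distribution -- no backpropagation."""
    X = extract_patches(images, k)
    # Eigen-decomposition of the patch autocorrelation matrix
    C = X.T @ X / len(X)
    eigvals, eigvecs = np.linalg.eigh(C)
    # The leading eigenvectors span the subspace; reshape them into kernels
    order = np.argsort(eigvals)[::-1][:n_kernels]
    return eigvecs[:, order].T.reshape(n_kernels, k, k)

# Toy usage: 10 random 28 x 28 "character" images, 8 kernels of size 5 x 5
rng = np.random.default_rng(0)
imgs = rng.random((10, 28, 28))
kernels = subspace_kernels(imgs, k=5, n_kernels=8)
print(kernels.shape)  # (8, 5, 5)
```

In the per-class variant suggested by the abstract, one such subspace would be fitted to the patches of each character class, so that each class contributes its own bank of kernels.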
ISSN: 2379-2140
DOI: 10.1109/ICDAR.2017.173