Multi-scale multi-class conditional generative adversarial network for handwritten character generation

Bibliographic Details
Published in: The Journal of Supercomputing, Vol. 75, No. 4, pp. 1922-1940
Main Authors: Liu, Jin; Gu, Chenkai; Wang, Jin; Youn, Geumran; Kim, Jeong-Uk
Format: Journal Article
Language: English
Published: New York: Springer US, 01.04.2019 (Springer Nature B.V.)

Summary: Handwritten character generation is a popular research topic with various applications. Many methods based on pattern recognition, machine learning, deep learning, and related techniques have been proposed in the literature. However, few of them can generate realistic, natural handwritten characters with a built-in assessment mechanism that improves the quality of the generated images until observers cannot tell whether they were written by a person. To address these problems, this paper proposes a novel generative adversarial network, the multi-scale multi-class conditional generative adversarial network (MSMC-CGAN). It is a neural network based on the conditional generative adversarial network (CGAN) and is designed for realistic multi-scale character generation. MSMC-CGAN combines global and partial image information as its condition, and this condition also enables the generation of multi-class handwritten characters. The model uses purpose-built network structures, image features, and a dedicated training method. To validate its performance, the authors applied it to Chinese handwriting generation and evaluated the output with the mean opinion score (MOS). The MOS results show that MSMC-CGAN achieves good performance.
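The abstract only describes the conditioning scheme at a high level. As a rough sketch of the general idea, not the authors' actual MSMC-CGAN architecture, the following PyTorch code conditions both the generator and the discriminator on a class label plus a partial-image patch, so that global (noise + class) and partial (patch) information are fused into a single condition vector. All layer sizes, the 64x64 resolution, the number of classes, and the MLP design are illustrative assumptions.

```python
# Minimal CGAN-style sketch of conditioning on class label + partial image.
# This is NOT the paper's MSMC-CGAN; dimensions and layers are assumptions.
import torch
import torch.nn as nn

NUM_CLASSES = 10      # assumption: number of character classes
Z_DIM = 100           # assumption: latent noise dimension
IMG_SIZE = 64         # assumption: output resolution

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.label_emb = nn.Embedding(NUM_CLASSES, NUM_CLASSES)
        # Condition = noise + class embedding + flattened partial-image patch.
        in_dim = Z_DIM + NUM_CLASSES + IMG_SIZE * IMG_SIZE
        self.net = nn.Sequential(
            nn.Linear(in_dim, 512), nn.ReLU(inplace=True),
            nn.Linear(512, 1024), nn.ReLU(inplace=True),
            nn.Linear(1024, IMG_SIZE * IMG_SIZE), nn.Tanh(),
        )

    def forward(self, z, labels, partial):
        cond = torch.cat([z, self.label_emb(labels), partial.flatten(1)], dim=1)
        return self.net(cond).view(-1, 1, IMG_SIZE, IMG_SIZE)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.label_emb = nn.Embedding(NUM_CLASSES, NUM_CLASSES)
        # Real/fake image is judged jointly with the same condition.
        in_dim = IMG_SIZE * IMG_SIZE + NUM_CLASSES + IMG_SIZE * IMG_SIZE
        self.net = nn.Sequential(
            nn.Linear(in_dim, 512), nn.LeakyReLU(0.2, inplace=True),
            nn.Linear(512, 256), nn.LeakyReLU(0.2, inplace=True),
            nn.Linear(256, 1), nn.Sigmoid(),
        )

    def forward(self, img, labels, partial):
        x = torch.cat([img.flatten(1), self.label_emb(labels),
                       partial.flatten(1)], dim=1)
        return self.net(x)

# Smoke test with random tensors standing in for real handwriting data.
if __name__ == "__main__":
    g, d = Generator(), Discriminator()
    z = torch.randn(4, Z_DIM)
    labels = torch.randint(0, NUM_CLASSES, (4,))
    partial = torch.randn(4, 1, IMG_SIZE, IMG_SIZE)  # masked partial view
    fake = g(z, labels, partial)
    score = d(fake, labels, partial)
    print(fake.shape, score.shape)  # [4, 1, 64, 64] and [4, 1]
```

A real implementation would replace these MLPs with convolutional networks and wrap them in a full adversarial training loop; the sketch is only meant to show how a multi-class label and partial image information can be combined into one condition, as the abstract describes.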
ISSN: 0920-8542
EISSN: 1573-0484
DOI: 10.1007/s11227-017-2218-0