The Early Japanese Books Text Line Segmentation base on Image Processing and Deep Learning

Bibliographic Details
Published in: International Conference on Advanced Mechatronic Systems, pp. 299-304
Main Authors: Lyu, Bing; Akama, Ryo; Tomiyama, Hiroyuki; Meng, Lin
Format: Conference Proceeding
Language: English; Japanese
Published: IEEE, 01.08.2019
ISSN: 2325-0690
DOI: 10.1109/ICAMechS.2019.8861597

Summary: Early books record a great deal of information about the politics, economy, culture, and history of their time, so understanding them has recently become an important research topic. In Japan, however, many early books are written in Kuzushi characters, a cursive script that is no longer in use and that only a small number of specialists can read; as a result, a large number of early Japanese books remain undeciphered. Researchers are therefore trying to read early Japanese books with computer assistance, such as image processing and deep learning. These books contain both articles and pictures, sometimes on the same page, which makes character recognition more difficult and requires researchers to first segment the text lines and separate the articles from the pictures. This paper aims to segment text lines from scanned images of early Japanese books using deep learning. To achieve better accuracy, it also proposes an image processing method that uses projection profiles to delete frame noise. The experimental results show that Precision, Recall, and F-value reach 95.2%, 98.3%, and 96.6% respectively, demonstrating the effectiveness of the method.
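
The abstract mentions a preprocessing step that uses projection profiles to delete frame noise before text line segmentation. The snippet below is a minimal sketch of that general idea, not the authors' actual implementation; the OpenCV/NumPy workflow and the parameters border_frac and peak_ratio are assumptions introduced purely for illustration.

```python
# Minimal sketch (assumed approach, not the paper's exact pipeline):
# locate long dark frame lines near the page border via row/column
# projection profiles of a binarized page, then whiten those bands.
import cv2
import numpy as np

def remove_frame_noise(gray, border_frac=0.1, peak_ratio=0.5):
    """Whiten frame lines near the page border using projection profiles.

    gray        : 8-bit grayscale page image (ink darker than paper).
    border_frac : only rows/columns within this fraction of each edge are
                  treated as frame candidates (hypothetical parameter).
    peak_ratio  : a row/column counts as a frame line if its ink count
                  exceeds this fraction of the image width/height
                  (hypothetical parameter).
    """
    # Binarize with Otsu's threshold, inverted so ink -> 1, paper -> 0.
    _, binary = cv2.threshold(gray, 0, 1,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    h, w = binary.shape

    row_profile = binary.sum(axis=1)   # ink pixels per row
    col_profile = binary.sum(axis=0)   # ink pixels per column

    cleaned = gray.copy()
    top, bottom = int(h * border_frac), int(h * (1 - border_frac))
    left, right = int(w * border_frac), int(w * (1 - border_frac))

    # Rows near the top/bottom edge with a long dark run are frame lines.
    for y in range(h):
        if (y < top or y >= bottom) and row_profile[y] > peak_ratio * w:
            cleaned[y, :] = 255
    # Same test for columns near the left/right edge.
    for x in range(w):
        if (x < left or x >= right) and col_profile[x] > peak_ratio * h:
            cleaned[:, x] = 255
    return cleaned

# Example usage:
# page = cv2.imread("scanned_page.png", cv2.IMREAD_GRAYSCALE)
# cv2.imwrite("page_no_frame.png", remove_frame_noise(page))
```

The cleaned image would then be passed to the deep learning model for text line segmentation; the reported Precision, Recall, and F-value (with F = 2PR/(P+R)) measure how well the detected text lines match the ground truth.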