A robust approach to reading recognition of pointer meters based on improved mask-RCNN
Published in | Neurocomputing (Amsterdam), Vol. 388, pp. 90–101 |
---|---|
Main Authors | , , , |
Format | Journal Article |
Language | English |
Published | Elsevier B.V., 07.05.2020 |
Summary: | In this paper, we address a challenging task in real-world applications, i.e., automatic reading recognition for pointer meters, called PRM (Pointer Meters Recognition based on Mask-RCNN). This application is valuable in the fields of military, industry, and aerospace. However, the accuracy of recognizing the readings of pointer meters by machine vision is often affected by several factors, such as uneven illumination within an image, large variation of illumination across images, complex backgrounds, tilting of pointer meters, image blur, and scale change, which can leave the recognized readings with unacceptable accuracy. In this paper, a new robust approach to reading recognition of pointer meters is proposed. The proposed method makes three main contributions: (1) constructing a novel deep learning algorithm in which PrRoIPooling is used in lieu of RoIAlign in the existing Mask-RCNN, (2) classifying the type of pointer meter while fitting the pointer binary mask, and (3) calculating the readings of pointer meters by the proposed angle method. In addition, we report and release a new dataset for the community. Experiments show that the new algorithm significantly improves the accuracy of the recognized readings of pointer meters; moreover, the proposed approach is robust to natural environments and computationally efficient. |
---|---|
ISSN: | 0925-2312; 1872-8286 |
DOI: | 10.1016/j.neucom.2020.01.032 |
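The "angle method" mentioned in the abstract computes a reading by linearly mapping the pointer's angle onto the meter's value range. The sketch below is a hypothetical illustration of that idea, not the paper's exact formulation; all function and parameter names (`meter_reading`, `zero_angle_deg`, `full_angle_deg`) are assumptions for the example.

```python
def meter_reading(pointer_angle_deg, zero_angle_deg, full_angle_deg,
                  min_value, max_value):
    """Map the pointer's dial angle linearly onto the meter's value range.

    Hypothetical sketch of an angle-based reading computation; the paper's
    precise method may differ (e.g., in how angles are estimated from the
    fitted pointer mask).
    """
    # Total angular sweep of the scale, from the zero mark to the
    # full-scale mark, normalized into (0, 360].
    sweep = (full_angle_deg - zero_angle_deg) % 360 or 360
    # Angle swept by the pointer, measured from the zero mark.
    swept = (pointer_angle_deg - zero_angle_deg) % 360
    ratio = swept / sweep
    return min_value + ratio * (max_value - min_value)

# Example: a 0-10 gauge whose scale spans 270 degrees, starting at 225 degrees.
# The pointer at 360 degrees has swept half the scale, so the reading is 5.0.
print(meter_reading(pointer_angle_deg=360, zero_angle_deg=225,
                    full_angle_deg=495, min_value=0.0, max_value=10.0))
```

In practice the pointer angle would first be estimated from the binary pointer mask produced by the segmentation network (e.g., by fitting a line through the mask pixels), after which this linear mapping yields the reading.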