YOLO based deep learning on needle-type dashboard recognition for autopilot maneuvering system

Bibliographic Details
Published in Measurement and Control (London), Vol. 55, No. 7-8, pp. 567-582
Main Authors Chang, Chia-Ming, Liou, Yuan-Dean, Huang, Yi-Cheng, Shen, Shang-En, Yu, PeiJou, Chuang, TingHsueh, Chiou, Shean-Juinn
Format Journal Article
Language English
Published London, England: SAGE Publications, 01.07.2022
Summary: Developing a fully automatic auxiliary flying system with robot maneuvering is feasible. This study develops a control vision system that can read all kinds of needle-type meters. The vision device implements a modified YOLO-based object detection model to recognize airspeed readings from a needle-type dashboard. With this approach, the meter readout in the cockpit is replaced by a single camera and a powerful edge computer for future autopilot maneuvering purposes. A modified YOLOv4-tiny model is implemented by adding Spatial Pyramid Pooling (SPP) and a Bidirectional Feature Pyramid Network (BAFPN) to the neck region of the convolutional neural network (CNN) structure. The Taguchi method is applied to acquire a set of optimum hyperparameters for the CNN. An improved deep learning network with a higher mean Average Precision (mAP) than the conventional YOLOv4-tiny and a higher Frames Per Second (FPS) value than YOLOv4 is deployed successfully. A self-control system is established in which a camera receives airspeed indications from the designed virtual needle-type dashboard. Moreover, the dashboard's pointer is controlled by the proposed control method, which combines PID control with recognition of the pointer's rotation angle. A modified YOLOv4-tiny model with a fabricated system for visual dynamic recognition control is implemented successfully. The feasibility of improving mean average precision and frames per second toward autopilot maneuvering is verified.
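
For readers unfamiliar with the SPP module mentioned in the summary, the block below is a minimal sketch of the kind of Spatial Pyramid Pooling layer that can be added to a YOLOv4-tiny neck. The PyTorch framework, the 5/9/13 pooling kernel sizes, and the feature-map dimensions are assumptions for illustration only; this is not the authors' implementation.

```python
# Minimal Spatial Pyramid Pooling (SPP) sketch in PyTorch (illustrative only).
# The same feature map is max-pooled at several scales and the results are
# concatenated along the channel dimension, enlarging the receptive field
# without changing the spatial resolution.
import torch
import torch.nn as nn


class SPP(nn.Module):
    def __init__(self, kernel_sizes=(5, 9, 13)):  # kernel sizes are an assumption
        super().__init__()
        # Stride-1 max pooling with "same" padding keeps H and W unchanged.
        self.pools = nn.ModuleList(
            [nn.MaxPool2d(kernel_size=k, stride=1, padding=k // 2) for k in kernel_sizes]
        )

    def forward(self, x):
        # Output channels grow by a factor of (len(kernel_sizes) + 1).
        return torch.cat([x] + [pool(x) for pool in self.pools], dim=1)


if __name__ == "__main__":
    feat = torch.randn(1, 256, 13, 13)   # hypothetical neck feature map
    print(SPP()(feat).shape)             # torch.Size([1, 1024, 13, 13])
```

In a YOLOv4-tiny-style network, a block like this would typically sit between the backbone and the detection heads, usually followed by a 1x1 convolution to bring the concatenated channels back down before prediction.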
ISSN:0020-2940
2051-8730
DOI:10.1177/00202940221115199