SlimDeblurGAN-Based Motion Deblurring and Marker Detection for Autonomous Drone Landing

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 20, No. 14, p. 3918
Main Authors: Truong, Noi Quang; Lee, Young Won; Owais, Muhammad; Nguyen, Dat Tien; Batchuluun, Ganbayar; Pham, Tuyen Danh; Park, Kang Ryoung
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 14.07.2020

Summary: Deep learning-based marker detection for autonomous drone landing is widely studied due to its superior detection performance. However, no prior study has addressed non-uniform motion-blurred input images, and most previous handcrafted and deep learning-based methods fail on these challenging inputs. To solve this problem, we propose a deep learning-based marker detection method for autonomous drone landing that (1) introduces a two-phase framework of deblurring and object detection, adopting a slimmed version of the deblur generative adversarial network (DeblurGAN) model and a You Only Look Once version 2 (YOLOv2) detector, respectively, and (2) balances the processing time and accuracy of the system. To this end, we propose a channel-pruning framework that slims the DeblurGAN model into SlimDeblurGAN without significant accuracy degradation. Experimental results on two datasets show that the proposed method exhibits higher performance and greater robustness than previous methods, in both deblurring and marker detection.
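
The two-phase pipeline described above first restores a sharp image with the pruned DeblurGAN-style generator and then runs the YOLOv2 detector on the result. Below is a minimal sketch of that control flow, assuming PyTorch; the generator, detector, and confidence threshold are hypothetical placeholders, not the authors' released code.

import torch

def deblur_then_detect(frame, generator, detector, conf_thresh=0.5):
    # frame: (3, H, W) float tensor in [0, 1]; generator and detector are
    # callables standing in for SlimDeblurGAN and YOLOv2 (hypothetical here).
    with torch.no_grad():
        sharp = generator(frame.unsqueeze(0))  # phase 1: deblur the frame
        boxes, scores = detector(sharp)        # phase 2: detect the marker
    keep = scores > conf_thresh                # keep confident detections
    return boxes[keep], scores[keep]

The "slimming" step is channel pruning. As one illustration of the general idea (the paper's actual pruning criterion may differ), the sketch below ranks a convolution's output filters by their L1 norm, a common importance proxy, and keeps only the top fraction; a full framework would also shrink the input channels of downstream layers.

import torch.nn as nn

def prune_conv_channels(conv, keep_ratio=0.5):
    # Score each output filter by the L1 norm of its weights (an assumed
    # proxy, not necessarily the paper's criterion) and keep the top share.
    importance = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    keep = importance.argsort(descending=True)[:n_keep]
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[keep].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep].clone()
    return pruned  # the next layer's in_channels must shrink to match
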
ISSN: 1424-8220
DOI: 10.3390/s20143918