Troop camouflage detection based on deep action learning

Bibliographic Details
Published in: IAES International Journal of Artificial Intelligence, Vol. 11, no. 3, p. 859
Main Authors: Muslikhin, Muslikhin; Nasuha, Aris; Arifin, Fatchul; Suprapto, Suprapto; Winursito, Anggun
Format: Journal Article
Language: English
Published: Yogyakarta: IAES Institute of Advanced Engineering and Science, 01.09.2022

Summary: Detecting troop camouflage on the battlefield is crucial for prevailing, or deciding how to act, in critical survival situations. This paper proposes a hybrid model based on deep action learning for camouflage recognition and detection. To involve deep action learning in the proposed system, deep learning based on you only look once (YOLOv3) with SqueezeNet and the four steps of action learning were engaged. Following the successful formulation of the learning cycle, an instrument examines the environment and performance in action learning with qualitative weightings; specific target detection experiments with view angle, target localization, and the firing-point procedure were performed. Each deep action learning cycle is divided into planning, acting, observing, and reflecting. If the results do not meet the minimal passing grade after the first cycle, the cycle is repeated until the system succeeds at the firing point. Furthermore, this study found that deep action learning could enhance intelligence over earlier camouflage detection methods while maintaining acceptable error rates. As a result, deep action learning could be used in armament systems if the environment is properly identified.
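The plan-act-observe-reflect cycle described in the abstract can be sketched as a simple control loop. This is a minimal illustration only: the function names, the stub detector, and all numeric values below are assumptions, not the authors' implementation; a real system would run the paper's YOLOv3 + SqueezeNet pipeline where the stub is.

```python
def run_detector(view_angle):
    """Stub standing in for the YOLOv3 + SqueezeNet detector.

    Returns a detection confidence that, for illustration, improves as the
    view angle widens. A real system would run inference on a frame here.
    """
    return min(1.0, 0.25 + view_angle / 100)


def deep_action_learning(passing_grade=0.7, max_cycles=10):
    """Repeat the four action-learning steps until the passing grade is met."""
    view_angle = 10  # illustrative starting parameter for the plan
    for cycle in range(1, max_cycles + 1):
        # 1. Planning: choose detection parameters for this cycle.
        plan = {"view_angle": view_angle}
        # 2. Acting: run the detector with the planned parameters.
        score = run_detector(plan["view_angle"])
        # 3. Observing: record the outcome of this cycle.
        observation = {"cycle": cycle, "score": score}
        # 4. Reflecting: compare against the minimal passing grade.
        if score >= passing_grade:
            return observation  # firing-point criterion satisfied
        view_angle += 15  # adjust the plan and repeat the cycle
    return None  # no cycle reached the passing grade


result = deep_action_learning()
```

The key design point is that the loop only terminates when reflection judges the observed score acceptable; otherwise the plan is revised and the whole cycle repeats, mirroring the repeat-until-firing-point behavior the abstract describes.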
ISSN: 2089-4872, 2252-8938
DOI: 10.11591/ijai.v11.i3.pp859-871