Development of a Vision-Based Anti-Drone Identification Friend or Foe Model to Recognize Birds and Drones Using Deep Learning

Bibliographic Details
Published in: Applied Artificial Intelligence, Vol. 38, No. 1
Main Authors: Ghazlane, Yasmine; Gmira, Maha; Medromi, Hicham
Format: Journal Article
Language: English
Published: Taylor & Francis Group, 31.12.2024

Summary: Recently, the growing use of drones has paved the way for countless applications across many domains. However, their malicious exploitation has compromised airspace safety, making them double-edged weapons. Intelligent anti-drone systems capable of recognizing and neutralizing airborne targets are therefore increasingly required. In the existing literature, most attention has centered on recognizing drones as the sole airborne target, whereas the real challenge is distinguishing between drone and non-drone targets. To address this issue, this study develops an Identification Friend or Foe (IFF) model that classifies aerial targets into foe or friend categories by determining whether the target is a drone or a bird, respectively. To achieve this objective, artificial intelligence and computer vision approaches are combined in our model through transfer learning, data augmentation, and other techniques. Another contribution of this work is a study of the impact of network depth on classification performance, which is demonstrated through our experiments. A comparison of eight models shows that EfficientNetB6 achieves the best results, with 98.12% accuracy, 98.184% precision, 98.115% F1 score, and 99.85% Area Under the Curve (AUC). The computational results demonstrate the practicality of the developed model.
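
The abstract describes transfer learning from a pretrained EfficientNetB6 with data augmentation for binary drone-vs-bird classification. The following is a minimal Keras sketch of such a setup; the augmentation choices, input resolution handling, and training hyperparameters are illustrative assumptions, not the authors' published implementation.

```python
# Minimal sketch (assumed setup, not the authors' code): a frozen
# ImageNet-pretrained EfficientNetB6 backbone with a new binary
# classification head distinguishing foe (drone) from friend (bird).
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (528, 528)  # EfficientNetB6's native input resolution

# On-the-fly data augmentation, as mentioned in the abstract.
augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.1),
])

# Pretrained backbone, frozen so only the new head is trained.
base = tf.keras.applications.EfficientNetB6(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
base.trainable = False

inputs = layers.Input(shape=IMG_SIZE + (3,))
x = augment(inputs)
x = base(x, training=False)
x = layers.GlobalAveragePooling2D()(x)
# Sigmoid output: probability that the target is a drone (foe).
outputs = layers.Dense(1, activation="sigmoid")(x)

model = models.Model(inputs, outputs)
model.compile(
    optimizer="adam",
    loss="binary_crossentropy",
    metrics=["accuracy",
             tf.keras.metrics.Precision(),
             tf.keras.metrics.AUC()],
)
```

Trained on a labeled bird/drone image dataset, such a model reports the same accuracy, precision, and AUC metrics cited in the abstract; fine-tuning deeper backbone layers after the head converges is a common follow-up step.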
ISSN: 0883-9514; 1087-6545
DOI: 10.1080/08839514.2024.2318672