Implementation of YOLOv5-based Forest Fire Smoke Monitoring Model with Increased Recognition of Unstructured Objects by Increasing Self-learning data

Bibliographic Details
Published in: International Journal of Advanced Culture Technology (IJACT), Vol. 10, No. 4, pp. 536–546
Main Authors: 도군우, 김민영, 장시웅
Format: Journal Article
Language: English
Published: 국제문화기술진흥원, 31.12.2022
Summary: Society suffers extensive losses when a forest fire breaks out. If a forest fire can be detected early, the damage caused by its spread can be prevented. We therefore studied how to detect forest fires using the CCTV cameras that are already installed. In this paper, we present a deep learning model, based on YOLOv5, built through efficient image-data construction for monitoring forest fire smoke, which is unstructured data. Through this study, we investigated how to accurately detect forest fire smoke, an amorphous object that takes many forms, with YOLOv5. To increase accuracy in unstructured-object recognition, we introduce a self-learning method that generates additional training data on its own where data are insufficient. The method constructs a dataset with fixed labelling positions for images containing objects that can be extracted from the original images, using the original images and a model trained on them. Training the deep learning model on this enlarged dataset improved performance (mAP) and reduced the errors caused by detecting objects other than the target object, compared to a model trained only on the original images.
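The self-learning step the summary describes, where a model trained on the original images labels additional images and confident detections are kept as fixed-position annotations for retraining, can be sketched as a simple pseudo-labeling filter. The function names, confidence threshold, and the class index for smoke below are illustrative assumptions, not the authors' code; in practice the detections would come from a trained YOLOv5 model, and the emitted lines follow the YOLO label-file format (class, then center x/y and width/height normalized to the image size).

```python
# Hedged sketch of the pseudo-labeling loop: confident detections from an
# already-trained model become YOLO-format labels for the next training round.
# Detection tuples are (class_id, confidence, (x1, y1, x2, y2)) in pixels.

def to_yolo_line(cls, box, img_w, img_h):
    """Convert an absolute (x1, y1, x2, y2) box to a YOLO label line:
    'class cx cy w h' with all coordinates normalized to [0, 1]."""
    x1, y1, x2, y2 = box
    cx = (x1 + x2) / 2 / img_w
    cy = (y1 + y2) / 2 / img_h
    w = (x2 - x1) / img_w
    h = (y2 - y1) / img_h
    return f"{cls} {cx:.6f} {cy:.6f} {w:.6f} {h:.6f}"

def pseudo_label(detections, img_w, img_h, conf_thresh=0.5):
    """Keep only detections at or above the confidence threshold and emit
    YOLO label lines, fixing the labelling position for retraining."""
    return [to_yolo_line(cls, box, img_w, img_h)
            for cls, conf, box in detections if conf >= conf_thresh]

# Example: two smoke detections (class 0 = smoke, an assumed mapping) from a
# 1280x720 frame; only the confident one passes and becomes a pseudo-label.
dets = [(0, 0.82, (100, 50, 300, 250)),
        (0, 0.31, (600, 400, 700, 500))]
print(pseudo_label(dets, 1280, 720))
# → ['0 0.156250 0.208333 0.156250 0.277778']
```

Thresholding before accepting pseudo-labels is what keeps the enlarged dataset from amplifying the first model's false positives, which matches the reported reduction in detections of non-target objects.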
Bibliography: http://www.ipact.kr/eng/iconf/ijact/sub05.php
ISSN: 2288-7202, 2288-7318
DOI: 10.17703/IJACT.2022.10.4.536