Real time object detection and tracking system for video surveillance system


Bibliographic Details
Published in: Multimedia Tools and Applications, Vol. 80, No. 3, pp. 3981–3996
Main Authors: Jha, Sudan; Seo, Changho; Yang, Eunmok; Joshi, Gyanendra Prasad
Format: Journal Article
Language: English
Published: New York: Springer US, 01.01.2021
Publisher: Springer Nature B.V.

Summary: This paper introduces a system capable of real-time video surveillance in low-end edge computing environments by combining object detection and tracking algorithms. Recently, the accuracy of object detection has improved owing to deep-learning-based approaches such as the region-based convolutional network, which performs inference in two stages. One-stage detection algorithms such as the single shot detector (SSD) and You Only Look Once (YOLO) were developed at the expense of some accuracy and can be used for real-time systems. However, high-performance hardware such as a general-purpose graphics processing unit is still required to achieve excellent object detection performance and speed. In this study, we propose an approach called N-YOLO. Instead of the image-resizing step of the YOLO algorithm, N-YOLO divides the input image into the fixed-size sub-images used by YOLO and merges the detection results of each sub-image with inference results from different times using a correlation-based tracking algorithm, so that the amount of computation for object detection and tracking can be significantly reduced. In addition, we propose a system that can guarantee real-time performance in various edge computing environments by adaptively controlling the cycle of object detection and tracking.
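The tiling-and-merge step described in the summary can be sketched as follows. This is a minimal illustration, not the authors' implementation: the tile size of 416, the `detect_fn` callback, and all function names are assumptions, with `detect_fn` standing in for a YOLO inference call on each fixed-size sub-image.

```python
TILE = 416  # assumed fixed sub-image size matching the YOLO input


def tile_origins(width, height, tile=TILE):
    """Top-left corners of the fixed-size tiles covering the frame."""
    xs = range(0, width, tile)
    ys = range(0, height, tile)
    return [(x, y) for y in ys for x in xs]


def merge_detections(frame_w, frame_h, detect_fn, tile=TILE):
    """Run a per-tile detector and merge its boxes into frame coordinates.

    detect_fn(x0, y0) returns a list of (x, y, w, h, score) tuples in
    tile-local coordinates; each box is shifted by the tile origin so
    that the merged list is expressed in whole-frame coordinates.
    """
    merged = []
    for x0, y0 in tile_origins(frame_w, frame_h, tile):
        for (x, y, w, h, score) in detect_fn(x0, y0):
            merged.append((x0 + x, y0 + y, w, h, score))
    return merged
```

In a full pipeline, the merged boxes would then be handed to a correlation-based tracker between detection cycles, rather than re-running detection on every frame.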
ISSN: 1380-7501
eISSN: 1573-7721
DOI: 10.1007/s11042-020-09749-x