An online learning update modeling approach for aerial visual tracking


Bibliographic Details
Published in: Journal of Optics (New Delhi), Vol. 53, No. 1, pp. 676–686
Main Author: Wang, Limei
Format: Journal Article
Language: English
Published: New Delhi: Springer India (Springer Nature B.V.), 01.02.2024

Summary: Visual target tracking has been a popular research area over the past few decades because it is a crucial component of many computer vision applications, including video surveillance, human–computer interfaces, driver-assistance systems, and robotics. Target tracking has been the focus of many studies, producing a substantial body of literature, yet visual tracking remains a difficult problem despite the many algorithms that have been developed. A straightforward approach such as template matching can be effective when the target's appearance does not vary much and its velocity changes smoothly; in aerial video, however, the appearance can vary between indoor and outdoor environments and under camera and target motion, so more advanced methodologies are still needed. In this study, a visual target tracking method is proposed to address these difficulties in aerial videos. The proposed method is learning-based and takes advantage of prior knowledge supplied top-down. Using this combination, it employs parallel processing and a multiple-instance learning model to handle multi-target detection and appearance variations. Experimental results show that the proposed method outperforms the compared methods.
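
A short sketch can make the template-matching baseline mentioned in the abstract concrete. This is a generic illustration, not the paper's method; the video file name and the initial bounding box are hypothetical placeholders, and it assumes OpenCV (cv2) is available.

```python
# Minimal template-matching tracker: the appearance model is a fixed
# patch cut from the first frame, so it degrades when the target's
# appearance changes -- exactly the limitation the abstract points out.
import cv2

cap = cv2.VideoCapture("aerial.mp4")    # hypothetical input video
ok, frame = cap.read()
x, y, w, h = 100, 100, 40, 40           # hypothetical initial target box
template = frame[y:y + h, x:x + w].copy()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Slide the template over the frame; normalized cross-correlation
    # peaks where the current frame best matches the stored appearance.
    scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, (bx, by) = cv2.minMaxLoc(scores)
    cv2.rectangle(frame, (bx, by), (bx + w, by + h), (0, 255, 0), 2)
    cv2.imshow("template tracking", frame)
    if cv2.waitKey(1) == 27:            # Esc to stop
        break

cap.release()
cv2.destroyAllWindows()
```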
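
For the multiple-instance learning component, OpenCV ships an off-the-shelf MIL tracker (after Babenko et al.'s online MIL formulation) whose appearance model is re-trained online from bags of positive and negative patches. The snippet below only illustrates that general idea; it is not the tracker proposed in this paper, and the file name and bounding box are again placeholders.

```python
# Online MIL tracking with OpenCV's built-in implementation: unlike the
# fixed-template baseline, the appearance classifier is updated per frame.
import cv2

cap = cv2.VideoCapture("aerial.mp4")    # hypothetical input video
ok, frame = cap.read()
bbox = (100, 100, 40, 40)               # hypothetical initial target box

tracker = cv2.TrackerMIL_create()
tracker.init(frame, bbox)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, bbox = tracker.update(frame) # updates the MIL appearance model
    if found:
        x, y, w, h = map(int, bbox)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("MIL tracking", frame)
    if cv2.waitKey(1) == 27:            # Esc to stop
        break

cap.release()
cv2.destroyAllWindows()
```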
ISSN: 0972-8821, 0974-6900
DOI: 10.1007/s12596-023-01209-7