Spatio-Temporal Processing for Automatic Vehicle Detection in Wide-Area Aerial Video

Bibliographic Details
Published in: IEEE Access, Vol. 8, pp. 199562-199572
Main Authors: Gao, Xin; Szep, Jeno; Satam, Pratik; Hariri, Salim; Ram, Sundaresh; Rodriguez, Jeffrey J.
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020

Summary: Vehicle detection in aerial videos often requires post-processing to eliminate false detections. This paper presents a spatio-temporal processing scheme to improve automatic vehicle detection performance by replacing the thresholding step of existing detection algorithms with multi-neighborhood hysteresis thresholding for foreground pixel classification. The proposed scheme also performs spatial post-processing, which includes morphological opening and closing to shape and prune the detected objects, and temporal post-processing to further reduce false detections. We evaluate the performance of the proposed spatial processing on two local aerial video datasets and one parking vehicle dataset, and the performance of the proposed spatio-temporal processing scheme on five local aerial video datasets and one public dataset. Experimental evaluation shows that the proposed schemes improve vehicle detection performance for each of the nine algorithms when evaluated on seven datasets. Overall, the use of the proposed spatio-temporal processing scheme improves average F-score to above 0.8 and achieves an average reduction of 83.8% in false positives.
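
The summary describes a pipeline of hysteresis thresholding for foreground pixel classification, morphological opening and closing, and a temporal post-processing step. The sketch below is a rough structural illustration only, not the authors' code: it substitutes scikit-image's standard two-threshold hysteresis for the paper's multi-neighborhood variant, and the function names, parameters (low, high, radius, min_area, min_persistence), and the pixel-level persistence rule are illustrative assumptions.

```python
# Minimal sketch of a hysteresis-threshold + morphological + temporal
# post-processing pipeline (assumed structure; not the paper's exact method).
import numpy as np
from skimage.filters import apply_hysteresis_threshold
from skimage.morphology import (binary_opening, binary_closing, disk,
                                remove_small_objects)

def spatial_postprocess(score_map, low, high, radius=2, min_area=20):
    """Hysteresis thresholding followed by morphological opening/closing.

    `low`, `high`, `radius`, and `min_area` are illustrative parameters.
    """
    # Pixels above `high` act as seeds; pixels above `low` survive only if
    # connected to a seed (standard hysteresis, not the multi-neighborhood
    # variant proposed in the paper).
    mask = apply_hysteresis_threshold(score_map, low, high)
    footprint = disk(radius)
    mask = binary_opening(mask, footprint)   # remove small spurious blobs
    mask = binary_closing(mask, footprint)   # fill holes, join fragments
    return remove_small_objects(mask, min_size=min_area)

def temporal_postprocess(masks, min_persistence=3):
    """Keep a foreground pixel only if it has persisted for at least
    `min_persistence` consecutive frames (a crude stand-in for the
    paper's temporal post-processing)."""
    stack = np.stack(masks).astype(np.int32)
    run = np.zeros(stack.shape[1:], dtype=np.int32)
    out = []
    for frame in stack:
        run = (run + 1) * frame              # length of the current streak
        out.append(run >= min_persistence)
    return out
```

In practice the low and high thresholds would be tuned per detection algorithm and dataset, and the paper's temporal step may operate differently (e.g., on detected objects rather than individual pixels), so this sketch only conveys the overall structure of the scheme.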
Bibliography: USDOE National Nuclear Security Administration (NNSA); NA0003946
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3033466