Multi-Frame Based Homography Estimation for Video Stitching in Static Camera Environments

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 20, No. 1, p. 92
Main Authors: Park, Keon-Woo; Shim, Yoo-Jeong; Lee, Myeong-Jin; Ahn, Heejune
Format: Journal Article
Language: English
Published: Switzerland, MDPI AG, 22.12.2019

Summary: In this paper, a multi-frame based homography estimation method is proposed for video stitching in static camera environments. A homography that is robust against spatio-temporally induced noise can be estimated per interval, using feature points extracted during a predetermined time interval. The feature point with the largest blob response in each quantized location bin, the representative feature point, is used for matching a pair of video sequences. After matching the representative feature points from each camera, the homography for the interval is estimated by random sample consensus (RANSAC) on the matched representative feature points, with their chances of being sampled proportional to their numbers of occurrences in the interval. The performance of the proposed method is compared with that of the per-frame method by investigating alignment distortion and stitching scores for daytime and noisy video sequence pairs. It is shown that alignment distortion in overlapping regions is reduced and the stitching score is improved by the proposed method. The proposed method can be used for panoramic video stitching with static video cameras and for panoramic image stitching with less alignment distortion.
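To illustrate the pipeline described in the summary, the following is a minimal sketch in Python using OpenCV and NumPy, not the authors' code. It assumes SIFT keypoints as the blob-response detector, a bin size of 16 pixels for location quantization, a Lowe ratio test for matching, and it approximates the occurrence-proportional RANSAC sampling by replicating matches according to their occurrence counts; all of these specifics are assumptions for illustration only.

```python
# Sketch of interval-based (multi-frame) homography estimation between two
# static, synchronized cameras. Assumptions: SIFT features, 16-px location
# bins, Lowe ratio matching, and occurrence weighting via match replication.
import cv2
import numpy as np

BIN = 16  # quantization bin size in pixels (assumed)

def representative_features(frames):
    """Collect keypoints over an interval; per location bin, keep the
    keypoint with the largest response and count how often the bin fires."""
    sift = cv2.SIFT_create()
    best = {}   # bin -> (response, (x, y), descriptor)
    count = {}  # bin -> number of detections in the interval
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        kps, descs = sift.detectAndCompute(gray, None)
        if descs is None:
            continue
        for kp, d in zip(kps, descs):
            b = (int(kp.pt[0] // BIN), int(kp.pt[1] // BIN))
            count[b] = count.get(b, 0) + 1
            if b not in best or kp.response > best[b][0]:
                best[b] = (kp.response, kp.pt, d)
    pts = np.array([v[1] for v in best.values()], dtype=np.float32)
    descs = np.array([v[2] for v in best.values()], dtype=np.float32)
    occ = np.array([count[b] for b in best.keys()], dtype=np.int32)
    return pts, descs, occ

def interval_homography(frames_a, frames_b):
    """Match the representative features of two synchronized intervals and
    estimate one homography for the whole interval. Matches are replicated
    by occurrence count as a crude stand-in for weighted RANSAC sampling."""
    pa, da, occ_a = representative_features(frames_a)
    pb, db, _ = representative_features(frames_b)
    if len(da) < 2 or len(db) < 2:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(da, db, k=2)
    src, dst = [], []
    for pair in matches:
        if len(pair) < 2:
            continue
        m, n = pair
        if m.distance < 0.75 * n.distance:       # Lowe ratio test
            reps = int(occ_a[m.queryIdx])        # replicate frequent points
            src.extend([pa[m.queryIdx]] * reps)
            dst.extend([pb[m.trainIdx]] * reps)
    if len(src) < 4:
        return None
    H, _ = cv2.findHomography(np.array(src), np.array(dst),
                              cv2.RANSAC, ransacReprojThreshold=3.0)
    return H
```

Note that cv2.findHomography samples matches uniformly, so replicating matches only approximates sampling probabilities proportional to occurrence counts; reproducing the paper's scheme exactly would require a custom RANSAC loop that draws minimal sets with those probabilities.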
ISSN: 1424-8220
DOI: 10.3390/s20010092