Deep Online Video Stabilization

Bibliographic Details
Published in: arXiv.org
Main Authors: Wang, Miao; Yang, Guo-Ye; Lin, Jin-Kun; Shamir, Ariel; Zhang, Song-Hai; Lu, Shao-Ping; Hu, Shi-Min
Format: Paper; Journal Article
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 22.02.2018

More Information
Summary: Video stabilization is essential for most hand-held captured videos due to high-frequency shakes. Several 2D-, 2.5D- and 3D-based stabilization techniques are well studied, but to our knowledge, no solutions based on deep neural networks have been proposed. The reason for this is mostly the shortage of training data, as well as the challenge of modeling the problem using neural networks. In this paper, we solve the video stabilization problem using a convolutional neural network (ConvNet). Instead of dealing with offline holistic camera path smoothing based on feature matching, we focus on low-latency, real-time camera path smoothing without explicitly representing the camera path. Our network, called StabNet, learns a transformation for each input unsteady frame progressively along the timeline, while creating a more stable latent camera path. To train the network, we create a dataset of synchronized steady/unsteady video pairs via well-designed hand-held hardware. Experimental results show that the proposed online method (without using future frames) performs comparably to traditional offline video stabilization methods, while running about 30 times faster. Further, the proposed StabNet is able to handle night-time and blurry videos, where existing methods fail in robust feature matching.
ISSN:2331-8422
DOI:10.48550/arxiv.1802.08091
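
To make the abstract's idea concrete, the following is a minimal illustrative sketch in PyTorch of online, per-frame warp regression: a small ConvNet looks at the current unsteady frame together with a few previously stabilized frames and outputs a single affine transformation used to warp the current frame, with no explicit camera path. This is not the authors' StabNet; the architecture, the affine warp model, the history length, and all names (e.g. OnlineStabilizerSketch) are assumptions made purely for illustration.

```python
# Hypothetical sketch, NOT the paper's StabNet: regress one affine warp per
# incoming frame from the unsteady frame plus a short stabilized history,
# so stabilization can run online without reconstructing a camera path.
import torch
import torch.nn as nn
import torch.nn.functional as F


class OnlineStabilizerSketch(nn.Module):
    def __init__(self, history=3):
        super().__init__()
        in_ch = 3 * (history + 1)  # current RGB frame + `history` stabilized RGB frames
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(128, 6)  # parameters of a 2x3 affine warp
        # Initialize to the identity transform so the untrained network is a no-op.
        nn.init.zeros_(self.head.weight)
        with torch.no_grad():
            self.head.bias.copy_(torch.tensor([1., 0., 0., 0., 1., 0.]))

    def forward(self, unsteady, history_frames):
        # unsteady: (B, 3, H, W); history_frames: (B, 3*history, H, W)
        x = torch.cat([unsteady, history_frames], dim=1)
        theta = self.head(self.encoder(x).flatten(1)).view(-1, 2, 3)
        grid = F.affine_grid(theta, unsteady.shape, align_corners=False)
        stabilized = F.grid_sample(unsteady, grid, align_corners=False)
        return stabilized, theta


if __name__ == "__main__":
    net = OnlineStabilizerSketch(history=3)
    frame = torch.rand(1, 3, 128, 128)       # current unsteady frame
    hist = torch.rand(1, 9, 128, 128)         # three previously stabilized frames
    stabilized, theta = net(frame, hist)
    print(stabilized.shape, theta.shape)      # (1, 3, 128, 128) and (1, 2, 3)
```

In a training setup along the lines the abstract describes, the loss would compare the warped output against the synchronized steady frame from the steady/unsteady pair; the details of StabNet's losses, warp model, and inputs should be taken from the paper itself rather than from this sketch.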