Estimation of Motions in Color Image Sequences Using Hypercomplex Fourier Transforms

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 18, no. 1, pp. 168-187
Main Authors: Alexiadis, D.S., Sergiadis, G.D.
Format: Journal Article
Language: English
Published: New York, NY: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.01.2009

Summary: Although the motion estimation problem has been extensively studied, most of the proposed estimation approaches deal mainly with monochrome videos. The most common way to apply them to color image sequences is to process each color channel separately. A different, more sophisticated approach is to process the color channels in a "holistic" manner using quaternions, as proposed by Ell and Sangwine. In this paper, we extend standard spatiotemporal Fourier-based approaches to handle color image sequences, using the hypercomplex Fourier transform. We show that translational motions manifest as energy concentration along planes in the hypercomplex 3-D Fourier domain, and we describe a methodology to estimate the motions based on this property. Furthermore, we compare the three-channels-separately approach with our approach and show that the computational effort can be reduced by a factor of 1/3 using the hypercomplex Fourier transform. We also propose a simple accompanying method for extracting the moving objects in the hypercomplex Fourier domain. Our experimental results on synthetic and natural images verify our arguments throughout the paper.
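The plane-of-energy property the summary relies on can be checked numerically for a single (grayscale) channel; the hypercomplex transform is not needed to see the geometry. The sketch below (not the authors' code; a hypothetical illustration with an arbitrary Gaussian pattern and cyclic translation) shows that a sequence translating with velocity (vx, vy) pixels/frame has its 3-D DFT energy on the (wrapped) plane wt + vx*wx + vy*wy = 0, with frequencies in cycles/sample:

```python
import numpy as np

# A pattern translating with velocity (vx, vy) has 3-D spectral energy
# concentrated on the plane  wt + vx*wx + vy*wy = 0  (mod 1, due to
# the periodicity of the DFT).  Hypothetical test signal, not from the paper.
N, T = 64, 64
vx, vy = 2, 1                                   # integer velocity, pixels/frame
x = np.arange(N)
base = np.exp(-0.5 * (((x - 32)[:, None] / 6.0) ** 2
                      + ((x - 32)[None, :] / 6.0) ** 2))
# Cyclically translated frames: seq[t, y, x] = base[y - vy*t, x - vx*t]
seq = np.stack([np.roll(np.roll(base, vx * t, axis=1), vy * t, axis=0)
                for t in range(T)])             # shape (T, N, N)

F = np.fft.fftn(seq)                            # axes: (wt, wy, wx)
wt = np.fft.fftfreq(T)[:, None, None]
wy = np.fft.fftfreq(N)[None, :, None]
wx = np.fft.fftfreq(N)[None, None, :]

s = wt + vx * wx + vy * wy                      # distance from the motion plane
on_plane = np.abs(s - np.round(s)) < 1e-6       # exact for integer velocity
E = np.abs(F) ** 2
ratio = E[on_plane].sum() / E.sum()             # fraction of energy on the plane
print(ratio)
```

For exact integer cyclic motion the ratio is 1 up to floating-point error; for real sequences the energy merely concentrates near such a plane, and fitting the plane's orientation recovers (vx, vy).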
ISSN: 1057-7149 (print); 1941-0042 (electronic)
DOI: 10.1109/TIP.2008.2007603