Vision Saliency Feature Extraction Based on Multi-scale Tensor Region Covariance

Bibliographic Details
Published in: Information Retrieval, Vol. 10390, pp. 185-197
Main Authors: Wang, Shimin; Wang, Mingwen; Ye, Jihua; Jie, Anquan
Format: Book Chapter
Language: English
Published: Springer International Publishing AG, Switzerland, 2017
Series: Lecture Notes in Computer Science

Summary: When image saliency features are extracted with region covariance, the low-level higher-order data are typically handled by vectorization, which can destroy the structure of the data (color, intensity, orientation) and lead to a weaker representation and degraded overall performance. This paper introduces a sparse-representation approach to region covariance that preserves the inherent structure of the image. The approach first computes the low-level image data (color, intensity, orientation), then applies a multi-scale transform to extract multi-scale features and construct a tensor space, and finally uses tensor sparse coding to extract the low-level image features from the region covariance. The paper compares the experimental results with those of commonly used feature extraction algorithms; the results show that the proposed algorithm adheres more closely to the actual object boundaries and achieves better results. (A brief illustrative sketch of the multi-scale region covariance computation follows the record below.)
Bibliography:This work was supported in part by the National Natural Science Foundation of China under Grant 61650105, Grant 61462042, and Grant 61462045, in part by the Jiangxi Education Science and Technology Research Project under Grant GJJ160324.
ISBN: 3319686984; 9783319686981
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-68699-8_15
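The pipeline described in the summary (low-level feature maps, multi-scale region covariance, then tensor sparse coding) can be illustrated with a small sketch. The Python snippet below covers only the first two stages, computing region covariance descriptors around one location at several scales; the feature choices, function names, and scale values are illustrative assumptions and do not reproduce the paper's exact construction, and the tensor sparse coding stage is omitted.

```python
import numpy as np

def low_level_features(img):
    """Stack per-pixel low-level features: color (R, G, B), intensity,
    and orientation cues (absolute horizontal/vertical intensity gradients).
    The exact feature set is an illustrative assumption."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    intensity = (r + g + b) / 3.0
    dy, dx = np.gradient(intensity)
    return np.stack([r, g, b, intensity, np.abs(dx), np.abs(dy)], axis=-1)

def region_covariance(features, top, left, size):
    """Covariance matrix of the per-pixel feature vectors in a square region."""
    patch = features[top:top + size, left:left + size].reshape(-1, features.shape[-1])
    return np.cov(patch, rowvar=False)

def multiscale_region_covariances(img, center, scales=(8, 16, 32)):
    """Region covariance descriptors around `center` at several scales;
    stacking them gives one slice of a tensor such as the one the paper builds."""
    feats = low_level_features(img.astype(np.float64))
    cy, cx = center
    descriptors = []
    for s in scales:
        top = max(cy - s // 2, 0)
        left = max(cx - s // 2, 0)
        descriptors.append(region_covariance(feats, top, left, s))
    return np.stack(descriptors)  # shape: (num_scales, d, d)

# Example: descriptors for one location of a random test image.
img = np.random.rand(64, 64, 3)
print(multiscale_region_covariances(img, center=(32, 32)).shape)  # (3, 6, 6)
```

In this sketch each scale contributes one d x d covariance matrix over the same d per-pixel features, so the stacked result keeps the features' structure rather than flattening them into a single vector, which is the motivation the summary gives for the tensor-based treatment.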