An Iterative Spanning Forest Framework for Superpixel Segmentation

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 28, No. 7, pp. 3477-3489
Main Authors: Vargas-Munoz, John E., Chowdhury, Ananda S., Alexandre, Eduardo B., Galvao, Felipe L., Vechiatto Miranda, Paulo A., Falcao, Alexandre X.
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2019

Summary: Superpixel segmentation has emerged as an important research problem in the areas of image processing and computer vision. In this paper, we propose a framework, namely Iterative Spanning Forest (ISF), in which improved sets of connected superpixels (supervoxels in 3D) can be generated by a sequence of image foresting transforms. In this framework, one can choose the most suitable combination of ISF components for a given application, i.e., 1) a seed sampling strategy; 2) a connectivity function; 3) an adjacency relation; and 4) a seed pixel recomputation procedure. The superpixels in ISF structurally correspond to spanning trees rooted at those seeds. We present five ISF-based methods to illustrate different choices for those components. These methods are compared with a number of state-of-the-art approaches with respect to effectiveness and efficiency. Experiments are carried out on several datasets containing 2D and 3D objects with distinct texture and shape properties, including a high-level application, sky image segmentation. The theoretical properties of ISF are demonstrated in the supplementary material, and the results show that ISF-based methods rank consistently among the best for all datasets.
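The summary outlines the ISF loop: sample seeds, grow one spanning tree per seed with an image foresting transform under a chosen connectivity function and adjacency relation, then recompute the seeds and repeat. The sketch below is a minimal, hypothetical illustration of that loop on a 2D grayscale image; its component choices (regular grid seed sampling, an additive root-intensity cost, 4-adjacency, centroid-based seed recomputation) are assumptions for demonstration only, not the exact components evaluated in the paper.

```python
# Minimal ISF-style superpixel sketch (illustrative assumptions, not the paper's exact components).
import heapq
import numpy as np

def grid_seeds(shape, n_per_axis):
    """Seed sampling (assumed strategy): place seeds on a regular grid."""
    ys = np.linspace(0, shape[0] - 1, n_per_axis, dtype=int)
    xs = np.linspace(0, shape[1] - 1, n_per_axis, dtype=int)
    return [(int(y), int(x)) for y in ys for x in xs]

def ift_forest(img, seeds, alpha=0.5):
    """Image foresting transform: grow one spanning tree per seed.

    Connectivity function (assumed): extending a path to pixel q costs
    cost(p) + alpha * |I(q) - I(root)|.  Adjacency relation: 4-neighbors.
    """
    h, w = img.shape
    cost = np.full((h, w), np.inf)
    label = np.full((h, w), -1, dtype=int)
    heap = []
    for k, (y, x) in enumerate(seeds):
        cost[y, x] = 0.0
        label[y, x] = k
        heapq.heappush(heap, (0.0, y, x, k))
    while heap:
        c, y, x, k = heapq.heappop(heap)
        if c > cost[y, x]:
            continue  # stale queue entry
        sy, sx = seeds[k]
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                nc = c + alpha * abs(float(img[ny, nx]) - float(img[sy, sx]))
                if nc < cost[ny, nx]:
                    cost[ny, nx] = nc
                    label[ny, nx] = k
                    heapq.heappush(heap, (nc, ny, nx, k))
    return label

def recompute_seeds(label, n_seeds):
    """Seed recomputation (assumed procedure): move each seed to its superpixel centroid."""
    seeds = []
    for k in range(n_seeds):
        ys, xs = np.nonzero(label == k)
        if len(ys) > 0:
            seeds.append((int(ys.mean()), int(xs.mean())))
    return seeds

def isf(img, n_per_axis=4, n_iters=5):
    """Iterate IFT + seed recomputation; the final spanning forest defines the superpixels."""
    seeds = grid_seeds(img.shape, n_per_axis)
    label = ift_forest(img, seeds)
    for _ in range(n_iters - 1):
        seeds = recompute_seeds(label, len(seeds))
        label = ift_forest(img, seeds)
    return label

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    toy = rng.random((64, 64))          # synthetic grayscale image
    labels = isf(toy)
    print("superpixels:", len(np.unique(labels)))
```

Swapping any of the four components (e.g., a different connectivity function or a mixed/random seed sampling strategy) yields a different ISF variant, which is the design space the paper's five methods explore.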
ISSN: 1057-7149
EISSN: 1941-0042
DOI: 10.1109/TIP.2019.2897941