Automatic Extraction of Semantic Relationships from Images Using Ontologies and SVM Classifiers
Published in | Multimedia Content Analysis and Mining, Vol. 4577, pp. 184–194 |
---|---|
Main Authors | , , , |
Format | Book Chapter |
Language | English |
Published | Germany: Springer Berlin / Heidelberg, 2007 |
Series | Lecture Notes in Computer Science |
Subjects | |
Summary: | Extracting high-level semantic concepts from low-level visual features of images is a very challenging research problem. Because traditional machine learning approaches extract only fragmentary information from images, their performance remains unsatisfactory. In this paper, we propose a novel system that automatically extracts high-level concepts, such as spatial or natural-enemy relationships, from images using a combination of ontologies and SVM classifiers. The system consists of two phases. In the first phase, visual features are mapped to intermediate-level concepts (e.g., yellow, 45-degree stripes), and sets of these concepts are then classified into the relevant object concepts (e.g., tiger) by SVM classifiers; a revision module is used in this phase to improve the accuracy of the classification. In the second phase, based on the extracted visual information and a domain ontology, we deduce semantic relationships, such as spatial and natural-enemy relationships, between multiple objects in an image. Finally, we evaluate the proposed system on color images covering about 20 object concepts. |
---|---|
Bibliography: | This work was supported by a Korea Research Foundation Grant funded by the Korean Government (MOEHRD) (KRF-2006-521-D00457). |
ISBN: | 9783540734161; 3540734163 |
ISSN: | 0302-9743; 1611-3349 |
DOI: | 10.1007/978-3-540-73417-8_25 |
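
The summary above describes a two-phase pipeline: SVM classifiers map sets of intermediate-level visual concepts to object concepts, and a domain ontology is then used to deduce spatial and natural-enemy relationships between the recognized objects. The following is a minimal illustrative sketch of that flow, not the authors' implementation; the concept sets, the toy ontology, the bounding-box convention, and all function names are invented for the example, and the paper's revision module is omitted.

```python
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.svm import SVC

# ---- Phase 1: intermediate-level concepts -> object concepts via SVM ----
# Each image region is described by a set of intermediate-level concepts
# (e.g. "yellow", "45-degree stripes") and labelled with an object concept.
train_concept_sets = [
    {"yellow", "black", "45-degree stripes"},   # tiger
    {"grey", "trunk", "large body"},            # elephant
    {"brown", "antlers", "slender legs"},       # deer
]
train_objects = ["tiger", "elephant", "deer"]

encoder = MultiLabelBinarizer()
X_train = encoder.fit_transform(train_concept_sets)

# A multi-class SVM stands in for the paper's set of SVM classifiers;
# the revision module that corrects misclassifications is omitted here.
svm = SVC(kernel="linear")
svm.fit(X_train, train_objects)

# ---- Phase 2: ontology-based deduction of semantic relationships ----
# Toy "natural enemy" facts; a real system would query a domain ontology.
NATURAL_ENEMY = {("tiger", "deer"), ("tiger", "elephant")}

def spatial_relation(box_a, box_b):
    """Crude left/right relation from bounding boxes given as (x1, y1, x2, y2)."""
    centre_a = (box_a[0] + box_a[2]) / 2
    centre_b = (box_b[0] + box_b[2]) / 2
    return "left_of" if centre_a < centre_b else "right_of"

def deduce_relationships(regions):
    """regions: list of (intermediate-concept set, bounding box) for one image."""
    objects = svm.predict(encoder.transform([c for c, _ in regions]))
    boxes = [b for _, b in regions]
    relations = []
    for i in range(len(objects)):
        for j in range(i + 1, len(objects)):
            a, b = objects[i], objects[j]
            relations.append((a, spatial_relation(boxes[i], boxes[j]), b))
            if (a, b) in NATURAL_ENEMY or (b, a) in NATURAL_ENEMY:
                relations.append((a, "natural_enemy_of", b))
    return relations

print(deduce_relationships([
    ({"yellow", "black", "45-degree stripes"}, (10, 40, 60, 90)),   # tiger region
    ({"brown", "antlers", "slender legs"}, (120, 50, 170, 100)),    # deer region
]))
# -> [('tiger', 'left_of', 'deer'), ('tiger', 'natural_enemy_of', 'deer')]
```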