Reverse engineering expert visual observations: From fixations to the learning of spatial filters with a neural-gas algorithm
Published in | Expert systems with applications Vol. 40; no. 17; pp. 6707–6712
---|---
Main Authors | , , , , ,
Format | Journal Article
Language | English
Published | Amsterdam: Elsevier Ltd, 01.12.2013
Summary: | • A step towards replicating the process of an expert studying a relevant image. • Utilise eye-tracking technology to discover the expert's fixation points. • A growing-neural-gas algorithm finds a filter set suitable for identifying fixations. • Applied to new images, the filters can predict where the expert is likely to look.
---|---
Human beings can become experts in performing specific vision tasks, for example, doctors analysing medical images or botanists studying leaves. With sufficient knowledge and experience, people can become very efficient at such tasks. When attempting to perform these tasks with a machine vision system, it would be highly beneficial to replicate the process which the expert undergoes. Advances in eye-tracking technology can provide data that allow us to discover how an expert studies an image. This paper presents a first step towards utilising these data for computer vision purposes. A growing-neural-gas algorithm is used to learn a set of Gabor filters which give high responses to the image regions a human expert fixated on. These filters can then be used to identify regions in other images which are likely to be useful for a given vision task. The algorithm is evaluated by learning filters for locating specific areas of plant leaves.
ISSN: | 0957-4174, 1873-6793
DOI: | 10.1016/j.eswa.2013.05.042 |
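The pipeline the abstract describes, computing Gabor-bank responses at fixated image patches and clustering them with a growing-neural-gas learner, could be sketched roughly as below. This is a minimal illustration, not the authors' method: the filter-bank parameters, the distance threshold, and the simplified grow-or-adapt rule (a stand-in for the full growing-neural-gas algorithm, which also maintains edges and ages between units) are all assumptions.

```python
# Minimal sketch of the abstract's pipeline: Gabor responses at fixation
# points, clustered by a simplified growing-neural-gas-style learner.
# All parameters and data here are illustrative assumptions.
import numpy as np

def gabor_kernel(theta, frequency, sigma=3.0, size=15):
    """Real-valued Gabor kernel with orientation theta and spatial frequency."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates into the filter's orientation.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * frequency * xr)
    return envelope * carrier

def patch_responses(image, points, bank, size=15):
    """One Gabor-bank response vector per fixated patch (points are (row, col))."""
    half = size // 2
    feats = []
    for (r, c) in points:
        patch = image[r - half:r + half + 1, c - half:c + half + 1]
        feats.append([float(np.sum(patch * k)) for k in bank])
    return np.array(feats)

def grow_neural_gas_like(feats, max_units=5, epochs=20, lr=0.1, grow_dist=1.0):
    """Simplified prototype learner standing in for growing neural gas:
    adapt the winning unit, or insert a new unit when the best match is poor."""
    units = [feats[0].copy()]
    for _ in range(epochs):
        for f in feats:
            d = [np.linalg.norm(f - u) for u in units]
            best = int(np.argmin(d))
            if d[best] > grow_dist and len(units) < max_units:
                units.append(f.copy())                  # grow: new prototype
            else:
                units[best] += lr * (f - units[best])   # adapt the winner
    return np.array(units)
```

Applying the learned prototypes to a new image would then amount to scanning it with the same Gabor bank and flagging locations whose response vector lies close to some prototype, i.e. locations resembling what the expert fixated on.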