Unsupervised learning approach to attention-path planning for large-scale environment classification
Published in: 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1447–1452
Format: Conference Proceeding
Language: English
Publisher: IEEE, 01.09.2014
Summary: An unsupervised attention-path planning algorithm is proposed and applied to the classification of large unknown areas with small field-of-view cameras. Attention-path planning is formulated as a sequential feature selection problem that greedily finds a sequence of attentions yielding more informative observations, which leads to faster training and higher accuracy. To find a near-optimal attention path, adaptive submodular optimization is employed, where the objective function over the internal belief is adaptive submodular and adaptive monotone. First, the amount of information in each attention area is modeled as the variance of dissimilarities over the environment data set. With this model, the information gain is defined as a variance-reduction function, which has been shown to be submodular and monotone in many cases. Furthermore, as observations accumulate, the information gain of each attention area is iteratively updated by discarding non-informative prior knowledge, so that the expected information gain is maximized. The effectiveness of the proposed algorithm is verified through experiments showing that it significantly enhances environment classification accuracy with a reduced number of limited-field-of-view observations.
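The greedy variance-reduction selection described in the summary can be sketched as follows, under a jointly Gaussian observation model with a known prior covariance over attention areas. The covariance `C`, the function names, and the greedy loop below are illustrative assumptions for a minimal sketch, not the authors' implementation:

```python
import numpy as np

def variance_reduction(C, S):
    """Variance reduction from observing the attention areas in S,
    under a jointly Gaussian model with prior covariance C.

    Reduction = trace(C) - trace(posterior covariance of unobserved areas),
    since observed areas have zero posterior variance.
    """
    n = C.shape[0]
    if len(S) == 0:
        return 0.0
    S = list(S)
    R = [i for i in range(n) if i not in S]
    C_SS = C[np.ix_(S, S)]
    C_RS = C[np.ix_(R, S)]
    # Gaussian conditioning: Cov(R | S) = C_RR - C_RS C_SS^{-1} C_SR
    posterior = C[np.ix_(R, R)] - C_RS @ np.linalg.solve(C_SS, C_RS.T)
    return float(np.trace(C) - np.trace(posterior))

def greedy_attention_path(C, budget):
    """Greedily pick `budget` attention areas, each step taking the area
    with the largest marginal variance reduction -- the greedy rule whose
    near-optimality is what submodularity and monotonicity guarantee."""
    chosen = []
    for _ in range(budget):
        candidates = [j for j in range(C.shape[0]) if j not in chosen]
        best = max(candidates, key=lambda j: variance_reduction(C, chosen + [j]))
        chosen.append(best)
    return chosen
```

In use, `C` would be estimated from the dissimilarities among the environment data set; the paper's adaptive variant additionally re-weights each area's gain as observations arrive, which this offline sketch omits.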
ISSN: 2153-0858, 2153-0866
DOI: 10.1109/IROS.2014.6942747