Integrating distance metric learning and cluster-level constraints in semi-supervised clustering

Bibliographic Details
Published in: 2017 International Joint Conference on Neural Networks (IJCNN), pp. 4118 - 4125
Main Authors: Magalhaes Nogueira, Bruno; Benevides Tomas, Yuri Karan; Marcondes Marcacini, Ricardo
Format: Conference Proceeding
Language: English
Published: IEEE, 01.05.2017

More Information
Summary: Semi-supervised clustering has been widely explored in recent years. In this paper, we present HCAC-ML (Hierarchical Confidence-based Active Clustering with Metric Learning), a novel approach to this task that employs distance metric learning through cluster-level constraints. HCAC-ML is based on HCAC, a state-of-the-art algorithm for hierarchical semi-supervised clustering that uses an active learning approach to insert cluster-level constraints. These constraints are presented to a variation of the ITML (Information-Theoretic Metric Learning) algorithm to learn a Mahalanobis-like distance function. We compared HCAC-ML with other semi-supervised clustering algorithms on 26 different datasets. Results indicate that HCAC-ML outperforms the other algorithms in most scenarios, especially when the number of constraints is small, which makes it useful in practical applications.
ISSN: 2161-4407
DOI: 10.1109/IJCNN.2017.7966376
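
The summary above mentions learning a Mahalanobis-like distance function from cluster-level constraints. As a rough illustration only (not the authors' implementation; the metric matrix M and the data points below are hypothetical examples), the following Python sketch shows how such a distance is evaluated once a positive semi-definite matrix M has been learned, compared against the plain Euclidean distance:

    import numpy as np

    # Illustrative sketch: d_M(x, y) = sqrt((x - y)^T M (x - y)).
    # M is a hypothetical symmetric positive definite matrix, standing in
    # for the matrix a metric-learning method such as ITML would produce.

    def mahalanobis_distance(x, y, M):
        """Distance between vectors x and y under the metric matrix M."""
        diff = x - y
        return float(np.sqrt(diff @ M @ diff))

    # Hypothetical 2-D data points and metric matrix.
    x = np.array([1.0, 2.0])
    y = np.array([2.5, 0.5])
    M = np.array([[2.0, 0.3],
                  [0.3, 1.0]])

    print(mahalanobis_distance(x, y, M))  # distance under the learned-style metric
    print(np.linalg.norm(x - y))          # Euclidean distance for comparison

In metric-learning methods of this kind, M is constrained to be positive semi-definite so that the learned distance remains a valid metric; when M is the identity matrix, the expression reduces to the ordinary Euclidean distance.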