Performance validation of clustering algorithms using selection of attributes and application of filters in terms of data reduction
Published in | Concurrency and Computation Vol. 34; no. 8 |
---|---|
Main Authors | , |
Format | Journal Article |
Language | English |
Published | Hoboken: Wiley Subscription Services, Inc, 10.04.2022 |
Subjects | |
Summary: | Clustering in unsupervised learning is a method for finding the inherent groups in a set of unlabeled data; such groups are termed clusters. Grouping datasets into clusters involves minimizing interclass similarity and maximizing intraclass similarity. Clustering large datasets therefore introduces the concept of data reduction, a process of identifying a relevant feature subset that is sufficient to represent the selected large dataset. Here, data reduction is performed by applying two clustering algorithms, Expectation-Maximization (EM) and K-Means, together with selection of attributes and two filters from the unsupervised category: (i) the Normalize filter at the instance level and (ii) the Randomize filter at the attribute level. The Livestock dataset is preprocessed with selection of attributes, the Normalize filter at the attribute level, and the Randomize filter at the instance level; the EM and K-Means algorithms are then executed on it, and the results are compared and analyzed in terms of data reduction. K-Means applied with the Randomize filter at the instance level and with selection of attributes shows good performance on large datasets when compared with the other clustering algorithms. |
---|---|
ISSN: | 1532-0626 1532-0634 |
DOI: | 10.1002/cpe.5364 |
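The pipeline the summary describes can be sketched as follows. This is a minimal sketch, not the authors' setup: the Normalize and Randomize filters named in the abstract are Weka preprocessing filters, and here they are approximated with scikit-learn equivalents (`MinMaxScaler` for normalization, row shuffling for randomization, `VarianceThreshold` as a simple stand-in for attribute selection). Synthetic random data is used as a placeholder for the Livestock dataset.

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.feature_selection import VarianceThreshold
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = rng.random((200, 8))  # placeholder data; the paper uses a Livestock dataset

# Normalize each attribute to [0, 1] (analogous to Weka's Normalize filter).
X = MinMaxScaler().fit_transform(X)

# Shuffle the instances (analogous to Weka's Randomize filter).
rng.shuffle(X)

# Simple attribute selection: drop near-constant features.
X = VarianceThreshold(threshold=0.01).fit_transform(X)

# Cluster the reduced data with K-Means and with EM (Gaussian mixture).
km_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
em_labels = GaussianMixture(n_components=3, random_state=0).fit_predict(X)
```

Comparing `km_labels` and `em_labels` (for example with cluster validity indices, or by runtime on the reduced versus full attribute set) mirrors the comparison the summary reports between the two algorithms.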