The MinMax k-Means clustering algorithm
Published in | Pattern Recognition, Vol. 47, No. 7, pp. 2505–2516
Main Authors | Tzortzis, Grigorios; Likas, Aristidis
Format | Journal Article
Language | English
Published | Kidlington: Elsevier Ltd, 01.07.2014
ISSN | 0031-3203; 1873-5142
DOI | 10.1016/j.patcog.2014.01.015
Abstract | Applying k-Means to minimize the sum of the intra-cluster variances is the most popular clustering approach. However, after a bad initialization, poor local optima can be easily obtained. To tackle the initialization problem of k-Means, we propose the MinMax k-Means algorithm, a method that assigns weights to the clusters relative to their variance and optimizes a weighted version of the k-Means objective. Weights are learned together with the cluster assignments, through an iterative procedure. The proposed weighting scheme limits the emergence of large variance clusters and allows high quality solutions to be systematically uncovered, irrespective of the initialization. Experiments verify the effectiveness of our approach and its robustness over bad initializations, as it compares favorably to both k-Means and other methods from the literature that consider the k-Means initialization problem.
Highlights |
• We propose the MinMax k-Means algorithm to minimize the maximum intra-cluster variance objective.
• Weights are assigned to the clusters relative to their intra-cluster variance.
• Our method prevents the occurrence of clusters with large intra-cluster variances in the solution.
• Our method systematically uncovers high quality solutions, irrespective of the initialization.
• MinMax k-Means constitutes a sound approach for initializing k-Means.
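The abstract and highlights describe an iterative scheme that alternates standard k-Means updates with a re-weighting of clusters by their intra-cluster variance, so that large-variance clusters are penalized in a weighted objective of the form sum_k w_k^p * V_k. The Python sketch below illustrates one plausible reading of that scheme; the function name, the exponent `p`, the memory parameter `beta`, and the closed-form weight update are illustrative assumptions for this sketch, not the authors' exact formulation from the paper.

```python
import numpy as np


def minmax_kmeans_sketch(X, k, p=0.5, beta=0.1, n_iter=50, seed=0):
    """Weighted k-Means in the spirit of MinMax k-Means (illustrative only).

    Each cluster receives a weight that grows with its intra-cluster
    variance, so high-variance clusters are penalized in the relaxed
    objective sum_k w_k**p * V_k.  The update rules are assumptions.
    """
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    centers = X[rng.choice(n, size=k, replace=False)].copy()  # random init
    weights = np.full(k, 1.0 / k)                              # uniform cluster weights

    for _ in range(n_iter):
        # Squared distances of every point to every current center.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)

        # Assignment step: minimize the weighted distance w_k**p * ||x - m_k||^2.
        labels = np.argmin((weights ** p) * d2, axis=1)

        # Center update: mean of the points assigned to each cluster.
        for j in range(k):
            members = X[labels == j]
            if len(members) > 0:
                centers[j] = members.mean(axis=0)

        # Intra-cluster variances V_k with the updated centers.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        V = np.array([d2[labels == j, j].sum() for j in range(k)])

        # Weight update: clusters with larger variance get larger weights and
        # are therefore penalized more in the next assignment step.  The
        # closed form w_k ~ V_k**(1/(1-p)) and the memory term beta are
        # assumptions for this sketch (requires p < 1).
        w_new = (V + 1e-12) ** (1.0 / (1.0 - p))
        weights = beta * weights + (1.0 - beta) * w_new / w_new.sum()

    return labels, centers, weights
```

A call such as `labels, centers, weights = minmax_kmeans_sketch(X, k=3)` on a data matrix `X` returns cluster assignments, centers, and the final cluster weights; the intent of the weighting, as the abstract states, is to limit the emergence of large-variance clusters regardless of the initial centers.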
Author | Tzortzis, Grigorios; Likas, Aristidis
Author_xml | Grigorios Tzortzis (gtzortzi@cs.uoi.gr); Aristidis Likas (arly@cs.uoi.gr)
BackLink | http://pascal-francis.inist.fr/vibad/index.php?action=getRecordDetail&idt=28417331 (view record in Pascal-Francis)
CODEN | PTNRA8 |
ContentType | Journal Article |
Copyright | 2014 Elsevier Ltd; 2015 INIST-CNRS
DOI | 10.1016/j.patcog.2014.01.015 |
DatabaseName | CrossRef Pascal-Francis Computer and Information Systems Abstracts Technology Research Database ProQuest Computer Science Collection Advanced Technologies Database with Aerospace Computer and Information Systems Abstracts Academic Computer and Information Systems Abstracts Professional |
DatabaseTitle | CrossRef Computer and Information Systems Abstracts Technology Research Database Computer and Information Systems Abstracts – Academic Advanced Technologies Database with Aerospace ProQuest Computer Science Collection Computer and Information Systems Abstracts Professional |
Discipline | Computer Science Applied Sciences |
EISSN | 1873-5142 |
EndPage | 2516 |
ExternalDocumentID | 28417331 10_1016_j_patcog_2014_01_015 S0031320314000338 |
ISSN | 0031-3203 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 7 |
Keywords | k-Means initialization; Balanced clusters; Clustering; k-Means; Performance evaluation; Automatic classification; Weighting; Minimax method; Iterative method; K means algorithm; Robustness; Signal classification
Language | English |
License | CC BY 4.0 |
PageCount | 12 |
PublicationDate | 2014-07-01
PublicationPlace | Kidlington
PublicationTitle | Pattern recognition |
PublicationYear | 2014 |
Publisher | Elsevier Ltd; Elsevier
StartPage | 2505 |
SubjectTerms | Algorithms; Applied sciences; Balanced clusters; Cluster analysis; Clustering; Clusters; Emergence; Exact sciences and technology; Information, signal and communications theory; k-Means; k-Means initialization; Pattern recognition; Robustness; Signal and communications theory; Signal representation. Spectral analysis; Signal, noise; Telecommunications and information theory; Variance; Weighting
Title | The MinMax k-Means clustering algorithm |
URI | https://dx.doi.org/10.1016/j.patcog.2014.01.015 https://www.proquest.com/docview/1541437481 https://www.proquest.com/docview/1730114147 |
Volume | 47 |