Cost-Sensitive Learning of Deep Feature Representations From Imbalanced Data

Bibliographic Details
Published in IEEE Transactions on Neural Networks and Learning Systems, Vol. 29; no. 8; pp. 3573-3587
Main Authors Khan, Salman H., Hayat, Munawar, Bennamoun, Mohammed, Sohel, Ferdous A., Togneri, Roberto
Format Journal Article
Language English
Published United States IEEE 01.08.2018
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Abstract Class imbalance is a common problem in the case of real-world object detection and classification tasks. Data of some classes are abundant, making them an overrepresented majority, and data of other classes are scarce, making them an underrepresented minority. This imbalance makes it challenging for a classifier to appropriately learn the discriminating boundaries of the majority and minority classes. In this paper, we propose a cost-sensitive (CoSen) deep neural network, which can automatically learn robust feature representations for both the majority and minority classes. During training, our learning procedure jointly optimizes the class-dependent costs and the neural network parameters. The proposed approach is applicable to both binary and multiclass problems without any modification. Moreover, as opposed to data-level approaches, we do not alter the original data distribution, which results in a lower computational cost during the training process. We report the results of our experiments on six major image classification data sets and show that the proposed approach significantly outperforms the baseline algorithms. Comparisons with popular data sampling techniques and CoSen classifiers demonstrate the superior performance of our proposed method.
AbstractList Class imbalance is a common problem in the case of real-world object detection and classification tasks. Data of some classes are abundant, making them an overrepresented majority, and data of other classes are scarce, making them an underrepresented minority. This imbalance makes it challenging for a classifier to appropriately learn the discriminating boundaries of the majority and minority classes. In this paper, we propose a cost-sensitive (CoSen) deep neural network, which can automatically learn robust feature representations for both the majority and minority classes. During training, our learning procedure jointly optimizes the class-dependent costs and the neural network parameters. The proposed approach is applicable to both binary and multiclass problems without any modification. Moreover, as opposed to data-level approaches, we do not alter the original data distribution, which results in a lower computational cost during the training process. We report the results of our experiments on six major image classification data sets and show that the proposed approach significantly outperforms the baseline algorithms. Comparisons with popular data sampling techniques and CoSen classifiers demonstrate the superior performance of our proposed method.
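
A minimal sketch of the idea described in the abstract, assuming PyTorch: it shows where class-dependent costs enter a standard cross-entropy objective. This is not the authors' implementation; the paper learns the costs jointly with the network parameters, whereas the sketch simply fixes the costs to inverse class frequencies. The names SmallCNN and inverse_frequency_costs are illustrative.

```python
# Hedged sketch only: fixed inverse-frequency class costs in a cross-entropy loss.
# The paper itself learns the class-dependent costs jointly with the network.
import torch
import torch.nn as nn


def inverse_frequency_costs(labels: torch.Tensor, num_classes: int) -> torch.Tensor:
    """Assign a higher cost to under-represented (minority) classes."""
    counts = torch.bincount(labels, minlength=num_classes).float().clamp(min=1)
    costs = counts.sum() / (num_classes * counts)  # inverse-frequency weighting
    return costs / costs.mean()                    # normalize around 1.0


class SmallCNN(nn.Module):
    """Toy classifier standing in for the deep feature extractor."""

    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))


# Illustrative usage on a random batch standing in for imbalanced data.
num_classes = 10
model = SmallCNN(num_classes)
images = torch.randn(16, 3, 32, 32)
labels = torch.randint(0, num_classes, (16,))

costs = inverse_frequency_costs(labels, num_classes)
criterion = nn.CrossEntropyLoss(weight=costs)  # class-dependent costs enter here
loss = criterion(model(images), labels)
loss.backward()  # in this sketch only the network parameters receive gradients
```

In the full method the cost values themselves would be treated as optimizable quantities, updated during training alongside the network, rather than derived once from the class counts as above.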
Author Hayat, Munawar
Khan, Salman H.
Togneri, Roberto
Bennamoun, Mohammed
Sohel, Ferdous A.
Author_xml – sequence: 1
  givenname: Salman H.
  orcidid: 0000-0002-9502-1749
  surname: Khan
  fullname: Khan, Salman H.
  email: salman.khan@data61.csiro.au
  organization: Data61, Commonwealth Scientific and Industrial Research Organization, Canberra, ACT, Australia
– sequence: 2
  givenname: Munawar
  surname: Hayat
  fullname: Hayat, Munawar
  email: munawar.hayat@canberra.edu.au
  organization: Human-Centered Technology Research Centre, University of Canberra, Canberra, ACT, Australia
– sequence: 3
  givenname: Mohammed
  surname: Bennamoun
  fullname: Bennamoun, Mohammed
  email: mohammed.bennamoun@uwa.edu.au
  organization: School of Computer Science and Software Engineering, The University of Western Australia, Crawley, WA, Australia
– sequence: 4
  givenname: Ferdous A.
  orcidid: 0000-0003-1557-4907
  surname: Sohel
  fullname: Sohel, Ferdous A.
  email: f.sohel@murdoch.edu.au
  organization: School of Engineering and Information Technology, Murdoch University, Perth, WA, Australia
– sequence: 5
  givenname: Roberto
  orcidid: 0000-0002-3778-4633
  surname: Togneri
  fullname: Togneri, Roberto
  email: roberto.togneri@uwa.edu.au
  organization: School of Electrical, Electronic and Computer Engineering, The University of Western Australia, Crawley, WA, Australia
BackLink https://www.ncbi.nlm.nih.gov/pubmed/28829320$$D View this record in MEDLINE/PubMed
BookMark eNp9kcFqGzEQhkVJqJPUL9BCWegll3WkkXYlHYsTt4ElgcaB3IR2PS4KXsmVtIG-fTax64MPncsMw_fPDPOfkxMfPBLymdEZY1RfLe_umocZUCZnIDkIBR_IGbAaSuBKnRxq-TQh05Se6Rg1rWqhP5IJKAWaAz0jzTykXD6gTy67FywatNE7_7sI6-IacVss0OYhYvELtxET-myzCz4Vixj64rZv7cb6DlfFtc32Ezld203C6T5fkMfFzXL-s2zuf9zOvzdlJ5jMZdVx3TJJuZZW1EhpyyyvBaq2rnAtKtBKSyYUr3BsqpbzVYsWlbYM6apj_IJc7uZuY_gzYMqmd6nDzXgKhiEZpjkDwammI_rtCH0OQ_TjdQaoFKoCCvVIfd1TQ9vjymyj6238a_79aQRgB3QxpBRxfUAYNW9-mHc_zJsfZu_HKFJHos7t3pejdZv_S7_spA4RD7sUZVBJzV8BmSuVuA
CODEN ITNNAL
CitedBy_id crossref_primary_10_1016_j_ijdrr_2024_104971
crossref_primary_10_1109_JSEN_2021_3059860
crossref_primary_10_1016_j_future_2019_05_080
crossref_primary_10_1007_s13042_019_00941_6
crossref_primary_10_1145_3594669
crossref_primary_10_3390_s20092448
crossref_primary_10_1016_j_neucom_2023_01_093
crossref_primary_10_1111_exsy_12680
crossref_primary_10_1111_jpim_12676
crossref_primary_10_1109_TBDATA_2024_3352978
crossref_primary_10_1109_TNNLS_2022_3154204
crossref_primary_10_1016_j_neucom_2019_05_013
crossref_primary_10_1109_TGRS_2023_3241366
crossref_primary_10_3390_s19020387
crossref_primary_10_1016_j_imu_2021_100690
crossref_primary_10_1021_acsestwater_1c00037
crossref_primary_10_1016_j_knosys_2022_109535
crossref_primary_10_1109_TIV_2022_3145035
crossref_primary_10_1016_j_joi_2020_101126
crossref_primary_10_1109_JBHI_2022_3162748
crossref_primary_10_1080_01431161_2023_2285742
crossref_primary_10_1016_j_asoc_2025_112864
crossref_primary_10_1007_s11554_019_00849_y
crossref_primary_10_3390_s24041230
crossref_primary_10_1515_phys_2019_0103
crossref_primary_10_1007_s00170_019_04090_6
crossref_primary_10_1109_JSEN_2022_3222535
crossref_primary_10_1111_odi_14474
crossref_primary_10_1109_TEVC_2022_3203862
crossref_primary_10_1109_TAI_2022_3160658
crossref_primary_10_3390_sym16121656
crossref_primary_10_1016_j_dsp_2024_104861
crossref_primary_10_1016_j_jss_2023_111650
crossref_primary_10_1016_j_eswa_2022_118129
crossref_primary_10_1007_s11802_024_5472_9
crossref_primary_10_1016_j_inffus_2022_12_014
crossref_primary_10_1109_ACCESS_2022_3204395
crossref_primary_10_1016_j_media_2024_103281
crossref_primary_10_3390_rs16081398
crossref_primary_10_1016_j_ipm_2020_102388
crossref_primary_10_1016_j_jmsy_2023_10_014
crossref_primary_10_1016_j_neucom_2024_128042
crossref_primary_10_1007_s11263_023_01971_y
crossref_primary_10_1016_j_aej_2022_07_062
crossref_primary_10_1071_CP21626
crossref_primary_10_1109_TSM_2020_3010984
crossref_primary_10_1007_s11063_021_10483_0
crossref_primary_10_3390_math10142486
crossref_primary_10_1109_TAI_2023_3275133
crossref_primary_10_1109_TETCI_2023_3327355
crossref_primary_10_1145_3579050
crossref_primary_10_3390_math13030368
crossref_primary_10_1016_j_ijmedinf_2024_105666
crossref_primary_10_1016_j_neucom_2023_01_063
crossref_primary_10_1007_s11227_022_05037_7
crossref_primary_10_1016_j_solener_2021_05_095
crossref_primary_10_1016_j_apacoust_2020_107740
crossref_primary_10_9758_cpn_2022_20_4_609
crossref_primary_10_1016_j_chaos_2021_111110
crossref_primary_10_1016_j_compag_2021_106067
crossref_primary_10_1016_j_jag_2021_102510
crossref_primary_10_3390_app122211662
crossref_primary_10_3390_electronics10212700
crossref_primary_10_1016_j_eswa_2023_120303
crossref_primary_10_1016_j_neucom_2025_129830
crossref_primary_10_1109_JIOT_2024_3404808
crossref_primary_10_1016_j_engappai_2022_105080
crossref_primary_10_1049_cit2_12311
crossref_primary_10_1109_TPAMI_2021_3069908
crossref_primary_10_1016_j_neucom_2020_01_101
crossref_primary_10_1109_TCYB_2021_3059631
crossref_primary_10_1016_j_jnca_2020_102766
crossref_primary_10_3390_rs12203301
crossref_primary_10_1007_s10489_022_03200_4
crossref_primary_10_1109_TKDE_2019_2905559
crossref_primary_10_1080_00207543_2021_1891318
crossref_primary_10_1088_1361_6501_adb872
crossref_primary_10_1007_s11042_024_19303_8
crossref_primary_10_1007_s11424_021_1038_8
crossref_primary_10_1016_j_engappai_2023_107639
crossref_primary_10_1039_D4DD00244J
crossref_primary_10_1016_j_knosys_2023_110831
crossref_primary_10_1016_j_neucom_2023_126735
crossref_primary_10_1016_j_isatra_2019_11_020
crossref_primary_10_1109_ACCESS_2018_2879221
crossref_primary_10_1109_TNNLS_2021_3051721
crossref_primary_10_1049_iet_gtd_2019_1562
crossref_primary_10_1016_j_neucom_2021_04_010
crossref_primary_10_1016_j_neunet_2021_07_003
crossref_primary_10_1109_TCSS_2023_3302651
crossref_primary_10_7717_peerj_cs_1975
crossref_primary_10_1016_j_image_2023_117074
crossref_primary_10_1109_TDSC_2022_3183170
crossref_primary_10_1016_j_eswa_2023_120760
crossref_primary_10_1016_j_ins_2023_119541
crossref_primary_10_1109_TMM_2023_3295090
crossref_primary_10_1186_s13677_022_00361_y
crossref_primary_10_1007_s10115_024_02279_0
crossref_primary_10_1016_j_cie_2023_109410
crossref_primary_10_1109_TNNLS_2023_3284430
crossref_primary_10_1016_j_media_2019_04_009
crossref_primary_10_1109_JBHI_2023_3325540
crossref_primary_10_1088_1361_6501_ad24b5
crossref_primary_10_1109_JBHI_2023_3279824
crossref_primary_10_1109_TIE_2020_3003622
crossref_primary_10_1016_j_ins_2019_02_062
crossref_primary_10_1049_stg2_12044
crossref_primary_10_1016_j_measurement_2023_113014
crossref_primary_10_1109_TASE_2019_2936645
crossref_primary_10_1016_j_knosys_2022_108296
crossref_primary_10_1088_1742_6596_1916_1_012031
crossref_primary_10_1155_2022_8733632
crossref_primary_10_1080_24725854_2021_2018528
crossref_primary_10_1109_TCSVT_2023_3311142
crossref_primary_10_3390_molecules27207085
crossref_primary_10_1109_TNNLS_2019_2913673
crossref_primary_10_1016_j_neunet_2024_106134
crossref_primary_10_1109_TIM_2021_3063755
crossref_primary_10_1186_s13007_020_00620_6
crossref_primary_10_1007_s11263_021_01434_2
crossref_primary_10_1109_TNNLS_2019_2948881
crossref_primary_10_1002_tee_22715
crossref_primary_10_3390_app10020505
crossref_primary_10_1016_j_patcog_2022_109158
crossref_primary_10_1007_s13042_022_01575_x
crossref_primary_10_1016_j_patrec_2023_05_035
crossref_primary_10_1109_TPAMI_2020_3041332
crossref_primary_10_1109_TNNLS_2024_3383672
crossref_primary_10_1109_TPAMI_2022_3196044
crossref_primary_10_1016_j_patcog_2020_107417
crossref_primary_10_1109_ACCESS_2021_3100057
crossref_primary_10_1007_s13042_022_01662_z
crossref_primary_10_1109_TIP_2021_3049955
crossref_primary_10_1002_cyto_a_24829
crossref_primary_10_1109_TNNLS_2023_3286484
crossref_primary_10_1016_j_neucom_2022_03_042
crossref_primary_10_1109_TCYB_2019_2913572
crossref_primary_10_1109_TNSE_2021_3100322
crossref_primary_10_1016_j_ins_2023_118935
crossref_primary_10_1016_j_ins_2019_09_032
crossref_primary_10_1016_j_future_2022_01_026
crossref_primary_10_3390_ijgi13100364
crossref_primary_10_1049_ipr2_12109
crossref_primary_10_1109_TETCI_2023_3251400
crossref_primary_10_1007_s10916_023_01938_z
crossref_primary_10_1088_1742_6596_2816_1_012020
crossref_primary_10_1007_s00521_023_09180_x
crossref_primary_10_1109_TNNLS_2022_3231917
crossref_primary_10_1109_ACCESS_2020_3033531
crossref_primary_10_1109_TIP_2023_3270103
crossref_primary_10_1109_TAFFC_2024_3405584
crossref_primary_10_1016_j_knosys_2021_106742
crossref_primary_10_1016_j_aej_2024_10_084
crossref_primary_10_1016_j_ins_2024_121280
crossref_primary_10_1109_LGRS_2021_3083262
crossref_primary_10_1016_j_atech_2022_100091
crossref_primary_10_1088_1742_6596_1267_1_012012
crossref_primary_10_1016_j_eswa_2024_126176
crossref_primary_10_3389_fpls_2025_1552553
crossref_primary_10_1007_s11063_020_10366_w
crossref_primary_10_3233_IDA_215735
crossref_primary_10_1109_TCSVT_2024_3487867
crossref_primary_10_3390_app9224829
crossref_primary_10_1016_j_knosys_2020_106631
crossref_primary_10_1109_ACCESS_2022_3233411
crossref_primary_10_1109_TIP_2019_2940533
crossref_primary_10_1109_ACCESS_2023_3341755
crossref_primary_10_1109_TNNLS_2022_3175068
crossref_primary_10_3390_s20205786
crossref_primary_10_1016_j_jmsy_2022_06_011
crossref_primary_10_1016_j_bbe_2019_05_008
crossref_primary_10_3390_rs14143295
crossref_primary_10_1109_TSMC_2022_3151394
crossref_primary_10_1109_TNNLS_2018_2876865
crossref_primary_10_1109_ACCESS_2021_3104340
crossref_primary_10_1016_j_newast_2020_101561
crossref_primary_10_1109_ACCESS_2023_3308998
crossref_primary_10_1016_j_rse_2021_112603
crossref_primary_10_3389_fgene_2019_00351
crossref_primary_10_1007_s13042_021_01321_9
crossref_primary_10_1109_JSTARS_2020_2995703
crossref_primary_10_1016_j_compbiomed_2022_105339
crossref_primary_10_1016_j_aei_2019_100935
crossref_primary_10_1016_j_neunet_2018_09_009
crossref_primary_10_1016_j_patrec_2019_07_006
crossref_primary_10_1080_07038992_2021_1910499
crossref_primary_10_3390_s23052610
crossref_primary_10_1007_s11042_022_13424_8
crossref_primary_10_1016_j_inffus_2023_101845
crossref_primary_10_1109_TNNLS_2021_3071122
crossref_primary_10_3390_app131810182
crossref_primary_10_1109_ACCESS_2019_2924060
crossref_primary_10_1155_2022_7839840
crossref_primary_10_1109_TNNLS_2020_3047335
crossref_primary_10_1109_TNNLS_2023_3269907
crossref_primary_10_1109_TGRS_2021_3071559
crossref_primary_10_1007_s10278_022_00618_7
crossref_primary_10_3390_pr10101938
crossref_primary_10_1007_s13278_022_00952_2
crossref_primary_10_1109_TCYB_2020_3016972
crossref_primary_10_23919_CHAIN_2024_100006
crossref_primary_10_1007_s00521_023_08363_w
crossref_primary_10_1016_j_dsp_2021_103212
crossref_primary_10_1109_ACCESS_2023_3260723
crossref_primary_10_1007_s12206_023_0501_y
crossref_primary_10_1109_TCSVT_2024_3383962
crossref_primary_10_1016_j_envpol_2021_118153
crossref_primary_10_1007_s11227_022_04965_8
crossref_primary_10_1016_j_knosys_2021_106925
crossref_primary_10_1109_ACCESS_2025_3531662
crossref_primary_10_1109_TCDS_2022_3175360
crossref_primary_10_3390_bioengineering10040420
crossref_primary_10_1016_j_eswax_2019_100003
crossref_primary_10_1109_JSEN_2021_3057076
crossref_primary_10_1016_j_asoc_2020_106989
crossref_primary_10_1007_s10489_020_02089_1
crossref_primary_10_1108_GS_03_2021_0041
crossref_primary_10_1016_j_artmed_2024_102801
crossref_primary_10_1016_j_bspc_2023_104962
crossref_primary_10_1016_j_eswa_2021_115673
crossref_primary_10_1186_s12911_019_0899_4
crossref_primary_10_1016_j_neucom_2023_126577
crossref_primary_10_1109_TVT_2021_3080678
crossref_primary_10_1109_ACCESS_2020_2975630
crossref_primary_10_18267_j_aip_254
crossref_primary_10_1007_s10209_024_01123_0
crossref_primary_10_1016_j_surfin_2024_104334
crossref_primary_10_3390_bioengineering9080350
crossref_primary_10_1007_s41060_024_00625_7
crossref_primary_10_1109_TNSM_2021_3112283
crossref_primary_10_1109_TAI_2024_3478191
crossref_primary_10_1007_s00521_020_05256_0
crossref_primary_10_3390_s22114075
crossref_primary_10_1109_TSM_2020_2994357
crossref_primary_10_1186_s12859_019_3269_4
crossref_primary_10_3390_app10041276
crossref_primary_10_1016_j_neucom_2021_07_040
crossref_primary_10_1016_j_inffus_2025_102982
crossref_primary_10_1016_j_infsof_2021_106742
crossref_primary_10_1080_01431161_2019_1582113
crossref_primary_10_1049_cvi2_12027
crossref_primary_10_1016_j_cviu_2022_103582
crossref_primary_10_1016_j_eswa_2024_125906
crossref_primary_10_3390_agriculture12020259
crossref_primary_10_1109_TCYB_2020_3001158
crossref_primary_10_1016_j_isprsjprs_2024_08_018
crossref_primary_10_1002_int_22678
crossref_primary_10_1016_j_knosys_2021_107649
crossref_primary_10_1109_TNNLS_2019_2917524
crossref_primary_10_3390_app11083331
crossref_primary_10_1016_j_iswa_2023_200316
crossref_primary_10_1002_qre_2983
crossref_primary_10_1109_ACCESS_2020_2991237
crossref_primary_10_1109_ACCESS_2023_3294099
crossref_primary_10_1186_s40537_019_0225_0
crossref_primary_10_1016_j_neucom_2020_12_122
crossref_primary_10_1109_JBHI_2020_3006145
crossref_primary_10_1038_s41598_018_28244_w
crossref_primary_10_1109_TAI_2024_3401102
crossref_primary_10_1109_ACCESS_2019_2933165
crossref_primary_10_3390_en15218059
crossref_primary_10_1186_s40537_019_0192_5
crossref_primary_10_1007_s13042_022_01677_6
crossref_primary_10_1051_0004_6361_202142751
crossref_primary_10_1016_j_neunet_2023_01_015
crossref_primary_10_1016_j_engappai_2023_106911
crossref_primary_10_1007_s00138_023_01480_5
crossref_primary_10_35378_gujs_854725
crossref_primary_10_1007_s10845_019_01522_8
crossref_primary_10_1109_TKDE_2021_3061428
crossref_primary_10_1016_j_eswa_2019_04_005
crossref_primary_10_2139_ssrn_4115383
crossref_primary_10_1016_j_dss_2022_113765
crossref_primary_10_1016_j_neunet_2021_04_013
crossref_primary_10_1016_j_procir_2020_04_106
crossref_primary_10_1155_2022_4343645
crossref_primary_10_1016_j_ins_2024_120351
crossref_primary_10_1016_j_neucom_2024_128853
crossref_primary_10_1109_TNNLS_2021_3136503
crossref_primary_10_1002_cjce_24610
crossref_primary_10_5861_ijrse_2021_648
crossref_primary_10_1016_j_knosys_2024_111500
crossref_primary_10_1016_j_neucom_2022_02_077
crossref_primary_10_3390_app10155293
crossref_primary_10_1287_ijoc_2023_1274
crossref_primary_10_1049_ipr2_12410
crossref_primary_10_1007_s00521_020_05529_8
crossref_primary_10_1186_s13059_022_02739_2
crossref_primary_10_1016_j_knosys_2020_106598
crossref_primary_10_1016_j_eswa_2021_115067
crossref_primary_10_1109_TKDE_2020_2974949
crossref_primary_10_1109_TGRS_2022_3177853
crossref_primary_10_1016_j_pacfin_2023_101948
crossref_primary_10_1109_TGRS_2024_3390764
crossref_primary_10_1109_TNNLS_2021_3110885
crossref_primary_10_1109_TPAMI_2019_2929166
crossref_primary_10_1016_j_engappai_2022_105621
crossref_primary_10_1016_j_eswa_2023_121084
crossref_primary_10_1016_j_infsof_2021_106662
crossref_primary_10_1007_s10489_022_04187_8
crossref_primary_10_1109_JSEN_2024_3466895
crossref_primary_10_1016_j_knosys_2020_106223
crossref_primary_10_1109_TNNLS_2019_2927647
crossref_primary_10_1109_TIM_2022_3227995
crossref_primary_10_1016_j_patcog_2022_108947
crossref_primary_10_1109_ACCESS_2021_3109989
crossref_primary_10_1007_s10994_021_06012_8
crossref_primary_10_1016_j_eswa_2021_116459
crossref_primary_10_1080_00224065_2024_2394604
crossref_primary_10_1007_s00500_021_06532_4
crossref_primary_10_3390_e24070871
crossref_primary_10_1109_ACCESS_2021_3052680
crossref_primary_10_1109_TCYB_2022_3163811
crossref_primary_10_1016_j_patrec_2020_02_007
crossref_primary_10_1007_s10618_024_01008_z
crossref_primary_10_1016_j_cie_2022_108936
crossref_primary_10_1016_j_apenergy_2022_120573
crossref_primary_10_3390_math10060934
crossref_primary_10_1016_j_patcog_2020_107382
crossref_primary_10_1109_TNNLS_2021_3106484
crossref_primary_10_1016_j_engappai_2020_103878
crossref_primary_10_1186_s12911_019_0929_2
crossref_primary_10_1016_j_engappai_2024_109345
crossref_primary_10_1016_j_ins_2022_11_108
crossref_primary_10_1109_ACCESS_2023_3239889
crossref_primary_10_1007_s10845_022_02067_z
crossref_primary_10_1038_s41598_024_73428_2
crossref_primary_10_1016_j_ins_2021_09_059
crossref_primary_10_1109_JSEN_2022_3211021
crossref_primary_10_1142_S2196888821500135
crossref_primary_10_1007_s42421_023_00067_w
crossref_primary_10_3103_S1060992X21010100
crossref_primary_10_1007_s11263_024_02104_9
crossref_primary_10_1007_s11227_022_04851_3
crossref_primary_10_1587_transinf_2021HCK0001
crossref_primary_10_1007_s10845_021_01907_8
crossref_primary_10_1109_ACCESS_2022_3161510
crossref_primary_10_1007_s10044_020_00890_9
crossref_primary_10_1016_j_ins_2021_11_058
crossref_primary_10_1007_s42979_024_03570_1
crossref_primary_10_1016_j_knosys_2020_106368
crossref_primary_10_1109_TNNLS_2021_3052243
crossref_primary_10_1007_s00521_021_06066_8
crossref_primary_10_1016_j_tust_2022_104399
crossref_primary_10_3389_fgene_2020_608512
crossref_primary_10_1109_TGRS_2021_3068447
crossref_primary_10_3233_IDA_184354
crossref_primary_10_1109_TCSVT_2021_3122110
crossref_primary_10_1155_2021_5577636
crossref_primary_10_1142_S2196888821500147
crossref_primary_10_1016_j_neucom_2024_128530
crossref_primary_10_1007_s10462_020_09820_x
crossref_primary_10_1088_2057_1976_ac7ad9
crossref_primary_10_1109_ACCESS_2020_2985097
crossref_primary_10_1016_j_cie_2024_110674
crossref_primary_10_1007_s10489_019_01624_z
crossref_primary_10_1016_j_neucom_2019_12_057
crossref_primary_10_1007_s10479_025_06528_5
crossref_primary_10_1109_TNNLS_2019_2920887
crossref_primary_10_1109_TCYB_2022_3173356
crossref_primary_10_1016_j_neunet_2024_106932
crossref_primary_10_1109_TCSS_2020_2970805
crossref_primary_10_1186_s40708_018_0080_3
crossref_primary_10_1155_2019_6943234
crossref_primary_10_1007_s12652_023_04610_z
crossref_primary_10_1007_s40123_023_00841_7
crossref_primary_10_1016_j_rse_2024_114274
crossref_primary_10_1109_ACCESS_2020_2964281
crossref_primary_10_1109_TNNLS_2019_2929575
crossref_primary_10_1016_j_compag_2023_108043
crossref_primary_10_1109_TCYB_2021_3126756
crossref_primary_10_3390_computers11050073
crossref_primary_10_1016_j_ins_2019_10_017
crossref_primary_10_1016_j_eswa_2023_122088
crossref_primary_10_3390_electronics14020280
crossref_primary_10_1016_j_jclepro_2020_122864
crossref_primary_10_1016_j_ijepes_2021_107574
crossref_primary_10_1016_j_knosys_2022_108966
crossref_primary_10_1371_journal_pone_0274522
crossref_primary_10_1016_j_ins_2020_05_141
crossref_primary_10_1109_ACCESS_2020_3007801
crossref_primary_10_1007_s11063_021_10679_4
crossref_primary_10_1109_TCYB_2021_3103885
crossref_primary_10_3390_s20020447
crossref_primary_10_1109_ACCESS_2021_3072623
crossref_primary_10_1093_jcde_qwae075
crossref_primary_10_1007_s00530_024_01317_9
crossref_primary_10_1016_j_patcog_2021_108114
crossref_primary_10_1007_s10462_023_10652_8
crossref_primary_10_1109_ACCESS_2024_3442569
crossref_primary_10_3390_sym13010004
crossref_primary_10_1088_1361_6501_abea3f
crossref_primary_10_1007_s13198_023_01897_1
crossref_primary_10_1007_s11227_022_04509_0
crossref_primary_10_1109_TCSVT_2023_3321733
crossref_primary_10_3390_electronics11091510
crossref_primary_10_1016_j_compbiomed_2022_106092
crossref_primary_10_1109_TPAMI_2019_2914680
crossref_primary_10_1016_j_cie_2021_107630
crossref_primary_10_1016_j_eswa_2023_122192
crossref_primary_10_1007_s11263_024_02081_z
crossref_primary_10_1109_ACCESS_2020_3047019
crossref_primary_10_1142_S0219519422400528
crossref_primary_10_1109_TCSVT_2022_3161427
crossref_primary_10_1109_JBHI_2024_3376428
crossref_primary_10_1007_s10994_022_06268_8
crossref_primary_10_1109_TNNLS_2021_3105104
crossref_primary_10_3390_s20174810
crossref_primary_10_1007_s13042_022_01746_w
crossref_primary_10_1007_s13042_023_02048_5
crossref_primary_10_1016_j_patcog_2021_108467
crossref_primary_10_1016_j_aei_2020_101131
crossref_primary_10_1109_TGRS_2022_3211847
crossref_primary_10_1016_j_knosys_2022_108816
crossref_primary_10_1109_TIP_2023_3321461
crossref_primary_10_1109_ACCESS_2020_3016653
crossref_primary_10_1109_TCSS_2020_3017818
crossref_primary_10_1145_3689627
crossref_primary_10_1016_j_asoc_2021_107783
crossref_primary_10_1145_3584360
crossref_primary_10_1007_s11227_025_06920_9
crossref_primary_10_1145_3502287
crossref_primary_10_1016_j_eswa_2024_124229
crossref_primary_10_1109_TII_2022_3190034
crossref_primary_10_1109_TNNLS_2023_3321753
crossref_primary_10_1145_3467477
crossref_primary_10_2196_16678
crossref_primary_10_1007_s40747_023_01225_x
crossref_primary_10_1016_j_neucom_2024_129318
crossref_primary_10_1049_cit2_12032
crossref_primary_10_1109_TKDE_2019_2951556
crossref_primary_10_1109_TNNLS_2021_3106306
crossref_primary_10_3390_rs16183494
crossref_primary_10_1007_s42979_020_00211_1
crossref_primary_10_1007_s42979_024_02678_8
crossref_primary_10_1016_j_eswa_2024_125794
crossref_primary_10_1016_j_eswa_2024_123495
crossref_primary_10_3390_s22062293
crossref_primary_10_1038_s41598_023_42689_8
crossref_primary_10_1145_3630256
crossref_primary_10_1016_j_cose_2022_102861
crossref_primary_10_1109_ACCESS_2019_2953085
crossref_primary_10_1007_s11432_021_3319_4
crossref_primary_10_3390_jsan10040072
crossref_primary_10_2139_ssrn_4666141
crossref_primary_10_1016_j_ins_2023_01_147
crossref_primary_10_1007_s11063_021_10534_6
crossref_primary_10_1016_j_asoc_2022_108855
crossref_primary_10_1016_j_ins_2023_02_064
crossref_primary_10_1016_j_engappai_2024_108523
crossref_primary_10_1016_j_neucom_2021_06_012
crossref_primary_10_1097_RCT_0000000000001497
crossref_primary_10_3390_diagnostics11081393
crossref_primary_10_1097_MAT_0000000000002299
crossref_primary_10_1016_j_compag_2022_107091
crossref_primary_10_1109_TPAMI_2023_3275585
crossref_primary_10_1016_j_patrec_2020_05_020
crossref_primary_10_1016_j_psep_2024_10_043
crossref_primary_10_1049_iet_ipr_2019_0255
crossref_primary_10_1186_s12911_022_02075_2
crossref_primary_10_1109_JIOT_2023_3342638
crossref_primary_10_3390_s22010310
crossref_primary_10_1049_ipr2_12942
crossref_primary_10_1016_j_cmpb_2022_106628
crossref_primary_10_3390_pr9091678
crossref_primary_10_1371_journal_pone_0252612
crossref_primary_10_1016_j_patcog_2022_108564
crossref_primary_10_1038_s41540_024_00415_8
crossref_primary_10_1016_j_neucom_2023_01_023
crossref_primary_10_1016_j_dss_2021_113544
crossref_primary_10_3390_s21175818
crossref_primary_10_48123_rsgis_1410250
crossref_primary_10_1007_s10489_020_01637_z
crossref_primary_10_3390_s21196616
crossref_primary_10_1016_j_ress_2023_109832
crossref_primary_10_1109_JSTARS_2024_3424498
crossref_primary_10_1007_s00500_024_09931_5
crossref_primary_10_1007_s00521_022_07167_8
crossref_primary_10_3390_electronics11091322
Cites_doi 10.1145/1961189.1961199
10.1109/TFUZZ.2013.2296091
10.1109/CVPR.2013.124
10.1109/ISBI.2012.6235558
10.1109/CVPR.2012.6247798
10.1109/CVPR.2013.115
10.1109/CVPR.2014.249
10.1109/TKDE.2006.17
10.1109/TGRS.2017.2707528
10.1007/s10115-009-0198-y
10.1109/CVPR.2010.5540018
10.1016/j.asoc.2013.09.014
10.1007/s00521-014-1584-2
10.1007/978-94-007-5389-1_4
10.1109/TPAMI.2015.2462355
10.1007/978-3-319-46182-3_13
10.1109/IJCNN.2016.7727770
10.1007/978-3-319-10584-0_26
10.1155/2013/196256
10.1109/ICCV.2009.5459183
10.1109/CVPRW.2014.131
10.1198/016214505000000907
10.1109/TIP.2016.2567076
10.1007/978-3-642-17534-3_19
10.1145/1007730.1007735
10.1007/978-3-319-10578-9_23
10.1109/CIDM.2011.5949434
10.1109/CVPR.2014.81
10.1007/s10115-011-0465-6
10.5244/C.25.76
10.1613/jair.953
10.1016/j.neucom.2013.05.051
10.1145/1014052.1014056
10.1007/s11263-015-0843-8
10.1007/11538059_91
10.1109/CVPR.2014.476
10.1109/ICIP.2016.7532411
10.1109/TKDE.2005.95
10.1109/TKDE.2008.239
10.1007/978-3-642-15561-1_11
10.1109/TSMCB.2008.2002909
10.1007/978-3-642-01307-2_43
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018
DBID 97E
RIA
RIE
AAYXX
CITATION
NPM
7QF
7QO
7QP
7QQ
7QR
7SC
7SE
7SP
7SR
7TA
7TB
7TK
7U5
8BQ
8FD
F28
FR3
H8D
JG9
JQ2
KR7
L7M
L~C
L~D
P64
7X8
DOI 10.1109/TNNLS.2017.2732482
DatabaseName IEEE Xplore (IEEE)
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
PubMed
Aluminium Industry Abstracts
Biotechnology Research Abstracts
Calcium & Calcified Tissue Abstracts
Ceramic Abstracts
Chemoreception Abstracts
Computer and Information Systems Abstracts
Corrosion Abstracts
Electronics & Communications Abstracts
Engineered Materials Abstracts
Materials Business File
Mechanical & Transportation Engineering Abstracts
Neurosciences Abstracts
Solid State and Superconductivity Abstracts
METADEX
Technology Research Database
ANTE: Abstracts in New Technology & Engineering
Engineering Research Database
Aerospace Database
Materials Research Database
ProQuest Computer Science Collection
Civil Engineering Abstracts
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
Biotechnology and BioEngineering Abstracts
MEDLINE - Academic
DatabaseTitle CrossRef
PubMed
Materials Research Database
Technology Research Database
Computer and Information Systems Abstracts – Academic
Mechanical & Transportation Engineering Abstracts
ProQuest Computer Science Collection
Computer and Information Systems Abstracts
Materials Business File
Aerospace Database
Engineered Materials Abstracts
Biotechnology Research Abstracts
Chemoreception Abstracts
Advanced Technologies Database with Aerospace
ANTE: Abstracts in New Technology & Engineering
Civil Engineering Abstracts
Aluminium Industry Abstracts
Electronics & Communications Abstracts
Ceramic Abstracts
Neurosciences Abstracts
METADEX
Biotechnology and BioEngineering Abstracts
Computer and Information Systems Abstracts Professional
Solid State and Superconductivity Abstracts
Engineering Research Database
Calcium & Calcified Tissue Abstracts
Corrosion Abstracts
MEDLINE - Academic
DatabaseTitleList MEDLINE - Academic
PubMed

Materials Research Database
Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Computer Science
EISSN 2162-2388
EndPage 3587
ExternalDocumentID 28829320
10_1109_TNNLS_2017_2732482
8012579
Genre orig-research
Research Support, Non-U.S. Gov't
Journal Article
GrantInformation_xml – fundername: University of Western Australia
  funderid: 10.13039/501100001801
– fundername: Australian Research Council
  grantid: DP150100294; DE120102960
  funderid: 10.13039/501100000923
GroupedDBID 0R~
4.4
5VS
6IK
97E
AAJGR
AARMG
AASAJ
AAWTH
ABAZT
ABQJQ
ABVLG
ACIWK
ACPRK
AENEX
AFRAH
AGQYO
AGSQL
AHBIQ
AKJIK
AKQYR
ALMA_UNASSIGNED_HOLDINGS
ATWAV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
EBS
EJD
IFIPE
IPLJI
JAVBF
M43
MS~
O9-
OCL
PQQKQ
RIA
RIE
RNS
AAYXX
CITATION
RIG
NPM
7QF
7QO
7QP
7QQ
7QR
7SC
7SE
7SP
7SR
7TA
7TB
7TK
7U5
8BQ
8FD
F28
FR3
H8D
JG9
JQ2
KR7
L7M
L~C
L~D
P64
7X8
ID FETCH-LOGICAL-c417t-5c39b170397a46e00b1a364e8b65ef452989714835e4e88b33dbeae89a1e0dc13
IEDL.DBID RIE
ISSN 2162-237X
2162-2388
IngestDate Fri Jul 11 07:31:25 EDT 2025
Mon Jun 30 03:34:47 EDT 2025
Mon Jul 21 06:07:22 EDT 2025
Tue Jul 01 00:27:25 EDT 2025
Thu Apr 24 23:03:49 EDT 2025
Wed Aug 27 02:05:27 EDT 2025
IsPeerReviewed false
IsScholarly true
Issue 8
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c417t-5c39b170397a46e00b1a364e8b65ef452989714835e4e88b33dbeae89a1e0dc13
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
content type line 23
ORCID 0000-0002-9502-1749
0000-0003-1557-4907
0000-0002-3778-4633
PMID 28829320
PQID 2074852026
PQPubID 85436
PageCount 15
ParticipantIDs proquest_miscellaneous_1931243090
pubmed_primary_28829320
crossref_primary_10_1109_TNNLS_2017_2732482
crossref_citationtrail_10_1109_TNNLS_2017_2732482
ieee_primary_8012579
proquest_journals_2074852026
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2018-08-01
PublicationDateYYYYMMDD 2018-08-01
PublicationDate_xml – month: 08
  year: 2018
  text: 2018-08-01
  day: 01
PublicationDecade 2010
PublicationPlace United States
PublicationPlace_xml – name: United States
– name: Piscataway
PublicationTitle IEEE Transactions on Neural Networks and Learning Systems
PublicationTitleAbbrev TNNLS
PublicationTitleAlternate IEEE Trans Neural Netw Learn Syst
PublicationYear 2018
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref57
ref13
ref56
ref12
ref59
ref58
ref14
ref55
ref11
ref54
ref10
chen (ref64) 2004
ref19
beijbom (ref38) 2014
srivastava (ref17) 2014; 15
zeiler (ref53) 2014
ref50
chung (ref34) 2015
ref48
espíndola (ref40) 2005; 35
ref42
ref41
donahue (ref52) 2014
nielsen (ref39) 2014; 1
ref49
yang (ref51) 2009
ref8
ref9
ref4
ref3
chang (ref45) 2015
lin (ref5) 2013
ref35
ref37
ref36
garcía (ref18) 2007
huang (ref32) 2004; 2
ref31
lin (ref47) 2014
ref30
chatfield (ref6) 2014
ref2
ref1
krizhevsky (ref16) 2012
lee (ref44) 2016
ref24
ref67
ref23
ref26
ref25
ref20
doersch (ref60) 2013
ref63
ref66
ref22
ref65
ref21
mani (ref62) 2003
kukar (ref33) 1998
ref28
ref27
wu (ref29) 2003
simonyan (ref43) 2014
lee (ref7) 2015
ref61
van der maaten (ref15) 2013
springenberg (ref46) 2014
References_xml – ident: ref66
  doi: 10.1145/1961189.1961199
– ident: ref25
  doi: 10.1109/TFUZZ.2013.2296091
– ident: ref59
  doi: 10.1109/CVPR.2013.124
– ident: ref42
  doi: 10.1109/ISBI.2012.6235558
– year: 2004
  ident: ref64
  article-title: Using random forest to learn imbalanced data
– ident: ref8
  doi: 10.1109/CVPR.2012.6247798
– ident: ref57
  doi: 10.1109/CVPR.2013.115
– start-page: 494
  year: 2013
  ident: ref60
  article-title: Mid-level visual element discovery as discriminative mode seeking
  publication-title: Proc NIPS
– start-page: 1939
  year: 2007
  ident: ref18
  article-title: The class imbalance problem in pattern classification and learning
  publication-title: Proc Congreso Español Inf
– start-page: 1
  year: 2003
  ident: ref62
  article-title: KNN approach to unbalanced data distributions: A case study involving information extraction
  publication-title: Proc WLID
– ident: ref11
  doi: 10.1109/CVPR.2014.249
– ident: ref14
  doi: 10.1109/TKDE.2006.17
– year: 2014
  ident: ref43
  publication-title: Very Deep Convolutional Networks for Large-scale Image Recognition
– start-page: 1323
  year: 2014
  ident: ref47
  article-title: Stable and efficient representation learning with nonnegativity constraints
  publication-title: Proc ICML
– year: 2015
  ident: ref45
  publication-title: Batch-normalized Maxout Network in Network
– ident: ref4
  doi: 10.1109/TGRS.2017.2707528
– ident: ref28
  doi: 10.1007/s10115-009-0198-y
– year: 2015
  ident: ref34
  publication-title: Cost-aware pretraining for multiclass cost-sensitive deep learning
– ident: ref49
  doi: 10.1109/CVPR.2010.5540018
– ident: ref19
  doi: 10.1016/j.asoc.2013.09.014
– start-page: 586
  year: 2014
  ident: ref38
  article-title: Guess-averse loss functions for cost-sensitive multiclass boosting
  publication-title: Proc Int Conf Mach Learn (ICML)
– start-page: 49
  year: 2003
  ident: ref29
  article-title: Class-boundary alignment for imbalanced dataset learning
  publication-title: Proc ICML Workshop
– ident: ref26
  doi: 10.1007/s00521-014-1584-2
– ident: ref41
  doi: 10.1007/978-94-007-5389-1_4
– ident: ref2
  doi: 10.1109/TPAMI.2015.2462355
– ident: ref36
  doi: 10.1007/978-3-319-46182-3_13
– ident: ref35
  doi: 10.1109/IJCNN.2016.7727770
– ident: ref61
  doi: 10.1007/978-3-319-10584-0_26
– start-page: 410
  year: 2013
  ident: ref15
  article-title: Learning with marginalized corrupted features
  publication-title: Proc 30th Int Conf Mach Learn (ICML)
– year: 2014
  ident: ref6
  publication-title: Return of the devil in the details: Delving deep into convolutional nets
– ident: ref12
  doi: 10.1155/2013/196256
– ident: ref48
  doi: 10.1109/ICCV.2009.5459183
– ident: ref58
  doi: 10.1109/CVPRW.2014.131
– ident: ref37
  doi: 10.1198/016214505000000907
– year: 2013
  ident: ref5
  publication-title: Network in Network
– ident: ref1
  doi: 10.1109/TIP.2016.2567076
– ident: ref24
  doi: 10.1007/978-3-642-17534-3_19
– ident: ref23
  doi: 10.1145/1007730.1007735
– ident: ref54
  doi: 10.1007/978-3-319-10578-9_23
– ident: ref22
  doi: 10.1109/CIDM.2011.5949434
– ident: ref31
  doi: 10.1109/CVPR.2014.81
– ident: ref13
  doi: 10.1007/s10115-011-0465-6
– start-page: 562
  year: 2015
  ident: ref7
  article-title: Deeply-supervised nets
  publication-title: Proc Artif Intell Statist
– start-page: 1097
  year: 2012
  ident: ref16
  article-title: Imagenet classification with deep convolutional neural networks
  publication-title: Proc NIPS
– start-page: 818
  year: 2014
  ident: ref53
  article-title: Visualizing and understanding convolutional networks
  publication-title: Proc ECCV
– year: 2014
  ident: ref46
  article-title: Improving deep neural networks with probabilistic maxout units
  publication-title: Proc ICLRs
– volume: 35
  year: 2005
  ident: ref40
  article-title: On extending F-measure and G-mean metrics to multi-class problems
  publication-title: Transactions on Information and Communication Technologies
– start-page: 464
  year: 2016
  ident: ref44
  article-title: Generalizing pooling functions in convolutional neural networks: Mixed, gated, and tree
  publication-title: Proc 19th Int Conf Artif Intell Statist
– ident: ref55
  doi: 10.5244/C.25.76
– ident: ref10
  doi: 10.1613/jair.953
– ident: ref27
  doi: 10.1016/j.neucom.2013.05.051
– ident: ref67
  doi: 10.1145/1014052.1014056
– ident: ref3
  doi: 10.1007/s11263-015-0843-8
– ident: ref20
  doi: 10.1007/11538059_91
– volume: 2
  start-page: 2-558
  year: 2004
  ident: ref32
  article-title: Learning classifiers from imbalanced data based on biased minimax probability machine
  publication-title: Proc CVPR
– volume: 1
  year: 2014
  ident: ref39
  publication-title: Neural Networks and Deep Learning
– ident: ref56
  doi: 10.1109/CVPR.2014.476
– start-page: 647
  year: 2014
  ident: ref52
  article-title: DeCAF: A deep convolutional activation feature for generic visual recognition
  publication-title: Proc ICML
– volume: 15
  start-page: 1929
  year: 2014
  ident: ref17
  article-title: Dropout: A simple way to prevent neural networks from overfitting
  publication-title: J Mach Learn Res
– ident: ref9
  doi: 10.1109/ICIP.2016.7532411
– ident: ref30
  doi: 10.1109/TKDE.2005.95
– start-page: 1794
  year: 2009
  ident: ref51
  article-title: Linear spatial pyramid matching using sparse coding for image classification
  publication-title: Proc CVPR
– start-page: 445
  year: 1998
  ident: ref33
  article-title: Cost-sensitive learning with neural networks
  publication-title: Proc ECAI
– ident: ref65
  doi: 10.1109/TKDE.2008.239
– ident: ref50
  doi: 10.1007/978-3-642-15561-1_11
– ident: ref63
  doi: 10.1109/TSMCB.2008.2002909
– ident: ref21
  doi: 10.1007/978-3-642-01307-2_43
SSID ssj0000605649
Score 2.6982079
Snippet Class imbalance is a common problem in the case of real-world object detection and classification tasks. Data of some classes are abundant, making them an...
SourceID proquest
pubmed
crossref
ieee
SourceType Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 3573
SubjectTerms Artificial neural networks
Australia
Classification
Classifiers
Computer applications
Computer vision
Convolutional neural networks (CNNs)
cost-sensitive (CoSen) learning
data imbalance
Data sampling
Image classification
loss functions
Neural networks
Object recognition
Representations
Tag clouds
Testing
Training
Training data
Title Cost-Sensitive Learning of Deep Feature Representations From Imbalanced Data
URI https://ieeexplore.ieee.org/document/8012579
https://www.ncbi.nlm.nih.gov/pubmed/28829320
https://www.proquest.com/docview/2074852026
https://www.proquest.com/docview/1931243090
Volume 29
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
link http://utb.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV1Lb9QwEB61PXGhpeWRUpCRuEG2dpzY8RG1rAoqe6CttLfIdiYcoEm1m1z662s7TiQQoN6ixE4cz3j8jecF8L7Ji6xBrVOJUqR5aYW379JUChQcS9GgCdk-V-LiJv-6LtY78HGOhUHE4HyGC38ZbPl1Zwd_VHbqpWkh1S7sOsVtjNWaz1Oow-UioN2MiSzNuFxPMTJUnV6vVpdX3pFLLtx-neWlr2KTOXTp4Av9bUsKNVb-DTfDtrPch2_TgEdvk5-LoTcLe_9HLsfH_tEBPI34k3waGeYZ7GB7CPtTbQcSl_oRXJ512z698u7tXiCSmIf1B-kaco54Rzx2HDZIvgdX2hjB1G7JctPdki-3xrtMWqzJue71c7hZfr4-u0hj5YXU5kz2aWG5MswJAyV1LpBSwzQXOZZGFOjJq0olnSLFC3Q3S8N5bVBjqTRDWlvGX8Be27X4CkiTGc40mlrUTV6bRlvDda0Zq53IZ0onwKbJr2xMS-6rY_yqgnpCVRVoV3naVZF2CXyY-9yNSTn-2_rIT_zcMs55AicTjau4breun8zLInOKaQLv5sduxXkzim6xG7aVg7wOFHGqaAIvR96Y3z2x1PHfv_kanriRlaMD4Qns9ZsB3zhQ05u3gZsfALgM8DY
linkProvider IEEE
linkToHtml http://utb.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwzV3BbtQwEB2VcoALBQptoICR4ISyje3ESQ4cUJfVLl32QLfS3oKdTDjQJtUmEYJv4Vf4N2zHiQQCbpW4RYmdOJ7x-I39PAPwogwjVqKUfoyx8MMkF2Z_N_BjgYJjIkpUNtrnSszPw3ebaLMD38ezMIhoyWc4MZd2L7-o884slR0baxrFqaNQnuLXL9pBa14vplqaLxmbvV2fzH2XQ8DPQxq3fpTzVFGt1mksQ4FBoKjkIsREiQhNQ9MkjbVLwCPUNxPFeaFQYpJKikGRU67fewNuapwRsf502LiCE2hPQFh8zahgPuPxZjiVE6TH69VqeWaoY_FEIwQWJiZvDtN4VgOm4JdJ0GZ1-TvAtRPdbA9-DF3U81s-T7pWTfJvv0WP_F_78C7ccQibvOmHxD3Yweo-7A3ZK4gzZvuwPKmb1j8zBH5j8omLNPuJ1CWZIl4Rg467LZIPlizszmhVDZlt60uyuFSGFJpjQaaylQ_g_Fr-6SHsVnWFh0BKpjiVqApRlGGhSpkrLgtJaaEnNZpKD-gg7Cx3gddN_o-LzDpgQZpZXcmMrmROVzx4Nda56sOO_LP0vhH0WNLJ2IOjQacyZ5kaXS8Ok4hp19uD5-NjbVPMRpGssO6aTIN6Dft4kAYeHPS6OL57UOFHf_7mM7g1X79fZsvF6vQx3NatTHq65BHsttsOn2gI16qndiQR-HjdavcThINMbg
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Cost-Sensitive+Learning+of+Deep+Feature+Representations+From+Imbalanced+Data&rft.jtitle=IEEE+transaction+on+neural+networks+and+learning+systems&rft.au=Khan%2C+Salman+H&rft.au=Hayat%2C+Munawar&rft.au=Bennamoun%2C+Mohammed&rft.au=Sohel%2C+Ferdous+A&rft.date=2018-08-01&rft.issn=2162-2388&rft.eissn=2162-2388&rft.volume=29&rft.issue=8&rft.spage=3573&rft_id=info:doi/10.1109%2FTNNLS.2017.2732482&rft.externalDBID=NO_FULL_TEXT
thumbnail_l http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/lc.gif&issn=2162-237X&client=summon
thumbnail_m http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/mc.gif&issn=2162-237X&client=summon
thumbnail_s http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/sc.gif&issn=2162-237X&client=summon