NTU RGB+D 120: A Large-Scale Benchmark for 3D Human Activity Understanding

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 42, No. 10, pp. 2684-2701
Main Authors: Liu, Jun; Shahroudy, Amir; Perez, Mauricio; Wang, Gang; Duan, Ling-Yu; Kot, Alex C.
Format: Journal Article
Language: English
Published: United States, IEEE, 01.10.2020
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Subjects: 3D action recognition; Activity understanding; Benchmark testing; Benchmarks; Cameras; Datasets; Deep learning; Human activity recognition; Human motion; Large-scale benchmark; Lighting; Machine learning; Performance evaluation; RGB+D vision; Semantics; Skeleton; Three-dimensional displays; Video analysis
Online Access: Get full text

Abstract: Research on depth-based human activity analysis has achieved outstanding performance and has demonstrated the effectiveness of 3D representation for action recognition. The existing depth-based and RGB+D-based action recognition benchmarks have a number of limitations, including the lack of large-scale training samples, a realistic number of distinct class categories, diversity in camera views, varied environmental conditions, and a variety of human subjects. In this work, we introduce a large-scale dataset for RGB+D human action recognition, which is collected from 106 distinct subjects and contains more than 114 thousand video samples and 8 million frames. This dataset contains 120 different action classes including daily, mutual, and health-related activities. We evaluate the performance of a series of existing 3D activity analysis methods on this dataset, and show the advantage of applying deep learning methods for 3D-based human action recognition. Furthermore, we investigate a novel one-shot 3D activity recognition problem on our dataset, and a simple yet effective Action-Part Semantic Relevance-aware (APSR) framework is proposed for this task, which yields promising results for recognition of the novel action classes. We believe the introduction of this large-scale dataset will enable the community to apply, adapt, and develop various data-hungry learning techniques for depth-based and RGB+D-based human activity understanding.
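The one-shot setting and the name of the APSR (Action-Part Semantic Relevance-aware) framework suggest that actions seen only once are matched by weighting body-part features with the semantic relevance between the action's name and body-part names. The sketch below is a minimal, hypothetical illustration of that idea in Python, not the authors' implementation: the body-part list, the softmax weighting, the averaged word vectors, and the nearest-exemplar matching are all assumptions. `emb` stands for any pretrained word-embedding lookup (e.g., word2vec or GloVe vectors in a dict), and the per-part features are presumed to come from a separate backbone.

```python
import numpy as np

# Illustrative body-part vocabulary; the dataset provides 25 skeleton joints per body,
# and how joints are grouped into parts is an assumption here.
BODY_PARTS = ["head", "hand", "arm", "torso", "leg", "foot"]

def phrase_vec(text, emb):
    """Average the word vectors of the words in an action name, e.g. 'drink water'."""
    vecs = [emb[w] for w in text.lower().split() if w in emb]
    if not vecs:
        raise ValueError(f"no embedding found for '{text}'")
    return np.mean(vecs, axis=0)

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def relevance_weights(action_name, emb):
    """Softmax over body parts of the semantic similarity to the action name."""
    a = phrase_vec(action_name, emb)
    sims = np.array([cosine(a, emb[p]) for p in BODY_PARTS])
    e = np.exp(sims - sims.max())
    return e / e.sum()                      # shape: (num_parts,)

def classify_one_shot(query_feats, exemplars, emb):
    """query_feats: (num_parts, feat_dim) array for the test clip.
    exemplars: {action_name: (num_parts, feat_dim)}, one sample per novel class.
    Returns the novel class whose exemplar best matches the query under the
    relevance-weighted sum of per-part cosine similarities."""
    best_name, best_score = None, -np.inf
    for name, ex_feats in exemplars.items():
        w = relevance_weights(name, emb)
        score = sum(w[i] * cosine(query_feats[i], ex_feats[i])
                    for i in range(len(BODY_PARTS)))
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

In this toy protocol, each novel class contributes exactly one labelled exemplar and the query is assigned to the class whose exemplar scores highest; parts judged semantically relevant to the action name (say, "hand" for "drink water") dominate the comparison.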
Authors
1. Liu, Jun (ORCID: 0000-0002-4365-4165; jliu029@ntu.edu.sg), Rapid-Rich Object Search Lab, School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore
2. Shahroudy, Amir (ORCID: 0000-0002-1045-6437; amirsh@chalmers.se), Department of Electrical Engineering, Chalmers University of Technology, Gothenburg, Sweden
3. Perez, Mauricio (mauricio001@ntu.edu.sg), Rapid-Rich Object Search Lab, School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore
4. Wang, Gang (ORCID: 0000-0002-1816-1457; wanggang@ntu.edu.sg), Alibaba Group, Hangzhou, China
5. Duan, Ling-Yu (ORCID: 0000-0002-4491-2023; lingyu@pku.edu.cn), National Engineering Laboratory for Video Technology, Peking University, Beijing, China
6. Kot, Alex C. (ORCID: 0000-0001-6262-8125; eackot@ntu.edu.sg), Rapid-Rich Object Search Lab, School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore
CODEN: ITPIDJ
Copyright: The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020
DOI: 10.1109/TPAMI.2019.2916873
Discipline: Engineering; Computer Science
EISSN: 2160-9292; 1939-3539
Pages: 2684-2701 (18 pages)
Genre: Original research; Research Support, Non-U.S. Gov't; Journal Article
Grant Information:
- National Natural Science Foundation of China, grants 61661146005 and U1611461
- National Basic Research Program of China (973 Program), grant 2015CB351806
- National Research Foundation, grant NRF2016NRF-NSFC001-098
ISSN: 0162-8828; 1939-3539
Peer Reviewed: Yes
License: https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
PMID: 31095476
URI: https://ieeexplore.ieee.org/document/8713892
https://www.ncbi.nlm.nih.gov/pubmed/31095476
https://www.proquest.com/docview/2441008758
https://www.proquest.com/docview/2232042688
https://research.chalmers.se/publication/519516