Skeleton Optical Spectra-Based Action Recognition Using Convolutional Neural Networks

Bibliographic Details
Published in IEEE Transactions on Circuits and Systems for Video Technology, Vol. 28, No. 3, pp. 807-811
Main Authors Hou, Yonghong; Li, Zhaoyang; Wang, Pichao; Li, Wanqing
Format Journal Article
Language English
Published New York: IEEE, 01.03.2018 (The Institute of Electrical and Electronics Engineers, Inc.)
Online Access https://ieeexplore.ieee.org/document/7742919

Abstract This letter presents an effective method to encode the spatiotemporal information of a skeleton sequence into color texture images, referred to as skeleton optical spectra, and employs convolutional neural networks (ConvNets) to learn the discriminative features for action recognition. Such spectrum representation makes it possible to use a standard ConvNet architecture to learn suitable "dynamic" features from skeleton sequences without training millions of parameters afresh and it is especially valuable when there is insufficient annotated training video data. Specifically, the encoding consists of four steps: mapping of joint distribution, spectrum coding of joint trajectories, spectrum coding of body parts, and joint velocity weighted saturation and brightness. Experimental results on three widely used datasets have demonstrated the efficacy of the proposed method.
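The four-step encoding is straightforward to illustrate. The sketch below is a minimal Python/Matplotlib rendering of the core idea rather than the letter's exact pipeline: each joint trajectory is projected to 2D and drawn as line segments whose hue sweeps the color spectrum over time, while per-frame joint velocity weights the saturation and brightness, yielding a color texture image a ConvNet can consume. The function name `skeleton_to_spectrum` and all rendering parameters are illustrative assumptions.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # off-screen rendering
import matplotlib.pyplot as plt
from matplotlib.colors import hsv_to_rgb

def skeleton_to_spectrum(joints, out_size=224):
    """Render a skeleton sequence as a color texture image.

    joints: array of shape (T, J, 2+) -- T frames, J joints,
            2D (or projected 3D) coordinates per joint.
    Returns an (out_size, out_size, 3) uint8 RGB image.
    """
    T, J, _ = joints.shape
    fig, ax = plt.subplots(figsize=(out_size / 100, out_size / 100), dpi=100)
    for j in range(J):
        traj = joints[:, j, :2]
        # per-frame joint speed, normalized to [0, 1]
        vel = np.linalg.norm(np.diff(traj, axis=0), axis=1)
        vel = vel / (vel.max() + 1e-8)
        for t in range(T - 1):
            hue = t / max(T - 2, 1)       # spectrum coding of time
            sat = 0.5 + 0.5 * vel[t]      # velocity-weighted saturation
            val = 0.5 + 0.5 * vel[t]      # velocity-weighted brightness
            ax.plot(traj[t:t + 2, 0], traj[t:t + 2, 1],
                    color=hsv_to_rgb([hue, sat, val]), linewidth=2)
    ax.axis("off")
    fig.canvas.draw()
    img = np.asarray(fig.canvas.buffer_rgba())[..., :3].copy()
    plt.close(fig)
    return img
```

A Kinect-style 25-joint sequence, for example, would produce one such image per action clip; the paper's body-part coding step could be approximated by drawing each part's joints into a dedicated image region, which is omitted here for brevity.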
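Because each spectrum is an ordinary RGB image, a standard ImageNet-pretrained ConvNet can be fine-tuned on them instead of being trained from scratch, which is the property the abstract highlights for settings with little annotated video. A minimal PyTorch sketch, assuming an AlexNet backbone and a placeholder number of action classes (the letter itself only says "a standard ConvNet architecture"):

```python
import torch
import torch.nn as nn
from torchvision import models

num_actions = 60  # placeholder; depends on the dataset

# Start from ImageNet weights and swap only the final classifier layer,
# so the millions of pretrained parameters are adapted, not relearned.
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
model.classifier[6] = nn.Linear(4096, num_actions)

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a batch of spectrum images.
images = torch.randn(8, 3, 224, 224)          # stand-in for encoded spectra
labels = torch.randint(0, num_actions, (8,))  # stand-in action labels
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```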
Authors
  Hou, Yonghong (houroy@tju.edu.cn), School of Electronic Information Engineering, Tianjin University, Tianjin, China
  Li, Zhaoyang (lizhaoyang@tju.edu.cn), School of Electronic Information Engineering, Tianjin University, Tianjin, China
  Wang, Pichao (pw212@uowmail.edu.au, ORCID 0000-0002-1430-0237), Advanced Multimedia Research Lab, University of Wollongong, Wollongong, NSW, Australia
  Li, Wanqing (wanqing@uow.edu.au, ORCID 0000-0002-4427-2687), Advanced Multimedia Research Lab, University of Wollongong, Wollongong, NSW, Australia
CODEN ITCTEM
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2018
DOI 10.1109/TCSVT.2016.2628339
Discipline Engineering
EISSN 1558-2205
EndPage 811
Genre Original Research
GrantInformation
  National Natural Science Foundation of China, grant 61571325 (funder ID 10.13039/501100001809)
  Key Projects in the Tianjin Science and Technology Pillar Program, grant 15ZCZD GX001900
ISSN 1051-8215
IsPeerReviewed true
IsScholarly true
Issue 3
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
PageCount 5
PublicationDate 2018-03-01
PublicationPlace New York
PublicationTitle IEEE Transactions on Circuits and Systems for Video Technology
PublicationTitleAbbrev TCSVT
PublicationYear 2018
Publisher IEEE (The Institute of Electrical and Electronics Engineers, Inc.)
StartPage 807
SubjectTerms Action recognition
Artificial neural networks
Body parts
Coding
Color texture
convolutional neural network (ConvNet)
Encoding
Feature recognition
Image coding
Image color analysis
Image recognition
Joints (anatomy)
Mapping
Neural networks
Skeleton
Training
Video data
Volume 28