Dancing-to-Music Character Animation

Bibliographic Details
Published in: Computer Graphics Forum, Vol. 25, No. 3, pp. 449-458
Main Authors: Takaaki Shiratori, Atsushi Nakazawa, Katsushi Ikeuchi
Format: Journal Article
Language: English
Published: Oxford, UK and Boston, USA: Blackwell Publishing, 01.09.2006

Abstract
In computer graphics, considerable research has been conducted on realistic human motion synthesis. However, most of this research does not consider human emotional aspects, which often strongly affect human motion. This paper presents a new approach for synthesizing dance performance matched to input music, based on the emotional aspects of dance performance. Our method consists of a motion analysis, a music analysis, and a motion synthesis based on the extracted features. In the analysis steps, motion and music feature vectors are acquired: motion vectors are derived from motion rhythm and intensity, while music vectors are derived from musical rhythm, structure, and intensity. To synthesize a dance performance, we first find candidate motion segments whose rhythm features match those of each music segment, and then select the motion segment set whose intensity is most similar to that of the music segments. Additionally, our system lets animators control the synthesis process by assigning desired motion segments to specified music segments. The experimental results indicate that our method creates dance performances as if the character were listening and expressively dancing to the music.

Categories and Subject Descriptors (according to ACM CCS): I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism - Animation; J.5 [Arts and Humanities]: Performing Arts - Music
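The two-stage matching described in the abstract can be sketched as follows. This is a minimal illustration with invented scalar features and names (`rhythm`, `intensity`, `rhythm_tol`, and the example segments): the paper's actual features are vectors extracted from motion-capture and audio analysis, and its synthesis evaluates whole motion segment sets rather than making one greedy pick per music segment.

```python
def synthesize(music_segments, motion_db, rhythm_tol=0.1):
    """For each music segment: stage 1 keeps motion segments whose
    rhythm is within rhythm_tol of the music rhythm; stage 2 picks
    the candidate whose intensity is closest to the music's."""
    sequence = []
    for music in music_segments:
        # Stage 1: candidate motion segments matched on rhythm.
        candidates = [m for m in motion_db
                      if abs(m["rhythm"] - music["rhythm"]) <= rhythm_tol]
        if not candidates:
            candidates = motion_db  # fall back to the whole database
        # Stage 2: choose the candidate with the most similar intensity.
        best = min(candidates,
                   key=lambda m: abs(m["intensity"] - music["intensity"]))
        sequence.append(best["name"])
    return sequence

# Toy data: two music segments, three captured motion segments.
music = [{"rhythm": 1.0, "intensity": 0.8},
         {"rhythm": 0.5, "intensity": 0.2}]
motions = [{"name": "spin", "rhythm": 1.0, "intensity": 0.9},
           {"name": "sway", "rhythm": 0.5, "intensity": 0.1},
           {"name": "jump", "rhythm": 1.0, "intensity": 0.3}]
print(synthesize(music, motions))  # ['spin', 'sway']
```

The animator-control feature the abstract mentions would amount to pre-assigning a fixed motion segment to a chosen music segment before this matching runs.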
Authors and Affiliations:
1. Takaaki Shiratori, Institute of Industrial Science, The University of Tokyo, Japan
2. Atsushi Nakazawa, Cybermedia Center, Osaka University, Japan
3. Katsushi Ikeuchi, Institute of Industrial Science, The University of Tokyo, Japan
Copyright: The Eurographics Association and Blackwell Publishing 2006
DOI: 10.1111/j.1467-8659.2006.00964.x
Discipline: Engineering
Subjects: Dance, Music
EISSN: 1467-8659
ISSN: 0167-7055
Peer Reviewed: Yes
– ident: e_1_2_12_4_2
  doi: 10.7551/mitpress/1486.001.0001
– year: 2000
  ident: e_1_2_12_19_2
  article-title: Music summarization using key phrases
  publication-title: Proc. IEEE Int'l Conf. on Acoustics, Speech, and Signal Processing
– ident: e_1_2_12_18_2
  doi: 10.1145/882262.882283
– ident: e_1_2_12_12_2
  doi: 10.1145/1015706.1015755
– ident: e_1_2_12_6_2
  doi: 10.1145/218380.218421
– ident: e_1_2_12_20_2
  doi: 10.1145/566654.566607
– ident: e_1_2_12_7_2
  doi: 10.1016/0167-6393(93)90037-L
– ident: e_1_2_12_33_2
  doi: 10.1109/TSA.2004.841053
StartPage 449
SubjectTerms Analysis
Animation
Computer graphics
Computer programming
Dance
Movement
Music
Studies
Title Dancing-to-Music Character Animation
URI https://api.istex.fr/ark:/67375/WNG-8707H7VJ-C/fulltext.pdf
https://onlinelibrary.wiley.com/doi/abs/10.1111%2Fj.1467-8659.2006.00964.x
https://www.proquest.com/docview/194659082
https://www.proquest.com/docview/29654512
Volume 25