Generative adversarial networks to create synthetic motion capture datasets including subject and gait characteristics

Bibliographic Details
Published in: Journal of Biomechanics, Vol. 177, p. 112358
Main Authors: Bicer, Metin; Phillips, Andrew T.M.; Melis, Alessandro; McGregor, Alison H.; Modenese, Luca
Format: Journal Article
Language: English
Published: United States: Elsevier Ltd, 01.12.2024

Abstract: Resource-intensive motion capture (mocap) systems challenge predictive deep learning applications, requiring large and diverse datasets. We tackled this by modifying generative adversarial networks (GANs) into conditional GANs (cGANs) that can generate diverse mocap data, including 15 marker trajectories, lower limb joint angles, and 3D ground reaction forces (GRFs), based on specified subject and gait characteristics. The cGAN comprised 1) an encoder compressing mocap data to a latent vector, 2) a decoder reconstructing the mocap data from the latent vector with specific conditions and 3) a discriminator distinguishing random vectors with conditions from encoded latent vectors with conditions. Single-conditional models were trained separately for age, sex, leg length, mass, and walking speed, while an additional model (Multi-cGAN) combined all conditions simultaneously to generate synthetic data. All models closely replicated the training dataset (<8.1 % of the gait cycle different between experimental and synthetic kinematics and GRFs), while a subset with narrow condition ranges was best replicated by the Multi-cGAN, producing similar kinematics (<1°) and GRFs (<0.02 body-weight) averaged by walking speeds. Multi-cGAN also generated synthetic datasets and results for three previous studies using reported mean and standard deviation of subject and gait characteristics. Additionally, unseen test data was best predicted by the walking speed-conditional, showcasing synthetic data diversity. The same model also matched the dynamical consistency of the experimental data (32 % average difference throughout the gait cycle), meaning that transforming the gait cycle data to the original time domain yielded accurate derivative calculations. Importantly, synthetic data poses no privacy concerns, potentially facilitating data sharing.
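The abstract describes a three-component cGAN: an encoder compressing mocap data to a latent vector, a decoder reconstructing mocap data from a latent vector plus conditions, and a discriminator separating prior-sampled latent vectors from encoded ones under the same conditions. The following is a minimal NumPy sketch of that structure only; all layer sizes, the flattened feature dimension, and the example condition vector are hypothetical, not the paper's actual values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration; the record does not give the
# paper's actual layer sizes or feature layout.
MOCAP_DIM = 300   # flattened gait-cycle features (marker trajectories, joint angles, GRFs)
LATENT_DIM = 32
COND_DIM = 5      # age, sex, leg length, mass, walking speed

def dense(in_dim, out_dim):
    """Randomly initialised affine layer: (weights, bias)."""
    return rng.normal(0.0, 0.1, (in_dim, out_dim)), np.zeros(out_dim)

relu = lambda x: np.maximum(x, 0.0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# 1) Encoder: mocap data -> latent vector
enc_w1, enc_b1 = dense(MOCAP_DIM, 128)
enc_w2, enc_b2 = dense(128, LATENT_DIM)
def encode(x):
    return relu(x @ enc_w1 + enc_b1) @ enc_w2 + enc_b2

# 2) Decoder: latent vector + conditions -> reconstructed mocap data
dec_w1, dec_b1 = dense(LATENT_DIM + COND_DIM, 128)
dec_w2, dec_b2 = dense(128, MOCAP_DIM)
def decode(z, c):
    h = relu(np.concatenate([z, c], axis=1) @ dec_w1 + dec_b1)
    return h @ dec_w2 + dec_b2

# 3) Discriminator: (latent vector, conditions) -> probability that the
#    latent vector came from the prior rather than from the encoder
dis_w1, dis_b1 = dense(LATENT_DIM + COND_DIM, 128)
dis_w2, dis_b2 = dense(128, 1)
def discriminate(z, c):
    h = relu(np.concatenate([z, c], axis=1) @ dis_w1 + dis_b1)
    return sigmoid(h @ dis_w2 + dis_b2)

# Synthetic generation: sample a latent vector from the prior and decode it
# under chosen subject/gait conditions (made-up example values).
conditions = np.array([[25.0, 1.0, 0.9, 70.0, 1.3]])  # age, sex, leg length (m), mass (kg), speed (m/s)
z_prior = rng.normal(size=(1, LATENT_DIM))
synthetic = decode(z_prior, conditions)
print(synthetic.shape)  # (1, 300)
```

After adversarial training (not shown), decoding prior samples under user-specified conditions is what would yield the synthetic, privacy-free datasets the abstract describes.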
Article Number: 112358
Authors:
– Bicer, Metin (ORCID: 0000-0002-9491-2080), Department of Civil and Environmental Engineering, Imperial College London, London, UK
– Phillips, Andrew T.M. (ORCID: 0000-0001-6618-0145), Department of Civil and Environmental Engineering, Imperial College London, London, UK
– Melis, Alessandro (ORCID: 0000-0002-8261-0421), Vivacity, London, UK
– McGregor, Alison H. (ORCID: 0000-0003-4672-332X), Department of Surgery and Cancer, Imperial College London, London, UK
– Modenese, Luca (ORCID: 0000-0003-1402-5359), email: l.modenese@unsw.edu.au, Department of Civil and Environmental Engineering, Imperial College London, London, UK
PubMed Link: https://www.ncbi.nlm.nih.gov/pubmed/39509807
BookMark eNqNkcFu1DAURS1URKeFX6gssWGTwc9OYmeDQBUUpEpsYG059kvHaWIPtjNo_p5U07LoBlZvc-7R070X5CzEgIRcAdsCg_b9uB17H2e0uy1nvN4CcNGoF2QDSoqKC8XOyIYxDlXHO3ZOLnIeGWOylt0rci66hnWKyQ053GDAZIo_IDXugCmb5M1EA5bfMd1nWiK1CU1Bmo-h7LB4S-dYfAzUmn1ZElJnislYMvXBTovz4Y7mpR_RFmqCo3fGF2p3JhlbMPm8GvJr8nIwU8Y3j_eS_Pzy-cf11-r2-82360-3la3rtlRDj6ZveNOq1hreg-BsGBS3IKVrhXKdhFY0jRkEANbAlDTSDUzwvm0G1TpxSd6dvPsUfy2Yi559tjhNJmBcshbAlQAFElb07TN0jEsK63crVUPDoW7ZSl09Uks_o9P75GeTjvqp0RVoT4BNMeeEw18EmH6YTo_6aTr9MJ0-TbcGP56CuPZx8Jh0th6DRefTWqV20f9b8eGZwk4-eGumezz-j-APdyO6-Q
Content Type: Journal Article
Copyright: © 2024 The Authors. Published by Elsevier Ltd. All rights reserved.
DOI: 10.1016/j.jbiomech.2024.112358
Discipline: Medicine; Engineering; Anatomy & Physiology
EISSN: 1873-2380
ISSN: 0021-9290
Open Access: true
Peer Reviewed: true
Keywords: Gait; Conditional Generative Adversarial Networks; Synthetic Mocap Dataset
License: This is an open access article under the CC BY license.
Open Access Link: https://www.sciencedirect.com/science/article/pii/S0021929024004366
PMID: 39509807
References Røislien, Skare, Gustavsen, Broch, Rennie, Opheim (b0130) 2009; 30
Kerrigan, Todd, Della Croce, Lipsitz, Collins (b0065) 1998; 79
Schreiber, Moissenet (b0145) 2019; 6
Nie, D., Trullo, R., Lian, J., Petitjean, C., Ruan, S., Wang, Q., Shen, D., 2017. Medical image synthesis with context-aware generative adversarial networks. In MICCAI 2017 Proceedings, Part III 20.
Choi, Biswal, Malin, Duke, Stewart, Sun (b0025) 2017
Rajagopal, Dembia, DeMers, Delp, Hicks, Delp (b0115) 2016; 63
Knudson (b0070) 2017; 16
Bicer, Phillips, Melis, McGregor, Modenese (b0010) 2022; 144
Hof (b0050) 1996; 3
Kouyoumdjian, Coulomb, Sanchez, Asencio (b0075) 2012; 98
Armanious, Jiang, Fischer, Küstner, Hepp, Nikolaou, Gatidis, Yang (b0005) 2020; 79
Goodfellow, Pouget-Abadie, Mirza, Xu, Warde-Farley, Ozair, Courville, Bengio (b0035) 2014
Bruening, Frimenko, Goodyear, Bowden, Fullenkamp (b0015) 2015; 41
Isola, P., Zhu, J.-Y., Zhou, T., Efros, A.A., 2017. Image-to-image translation with conditional adversarial networks. In Proceedings of the IEEE conference on computer vision and pattern recognition.
Karras, T., Laine, S., Aila, T., 2019. A style-based generator architecture for generative adversarial networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 4401-4410.
Salimans, Goodfellow, Zaremba, Cheung, Radford, Chen (b0140) 2016; 29
Moissenet, Leboeuf, Armand (b0085) 2019; 9
Zhou, S., Gordon, M.L., Krishna, R., Narcomey, A., Fei-Fei, L., Bernstein, M.S., 2019. Hype: A benchmark for human eye perceptual evaluation of generative models. arXiv preprint arXiv:1904.01121.
Sharifi Renani, Eustace, Myers, Clary (b0160) 2021; 21
Esteban, C., Hyland, S.L., Rätsch, G., 2017. Real-valued (medical) time series generation with recurrent conditional gans. arXiv preprint arXiv:1706.02633.
Raissi, M., Perdikaris, P., Karniadakis, G.E., 2017. Physics informed deep learning (part i): Data-driven solutions of nonlinear partial differential equations. arXiv preprint arXiv:1711.10561.
Halilaj, Rajagopal, Fiterau, Hicks, Hastie, Delp (b0040) 2018; 81
Robinson, Vanrenterghem, Pataky (b0125) 2021; 122
Weinhandl, Irmischer, Sievert (b0165) 2017
Pataky, Robinson, Vanrenterghem (b0105) 2013; 46
Mirza, M., Osindero, S., 2014. Conditional generative adversarial nets. arXiv preprint arXiv:1411.1784.
Ramesh, A., Dhariwal, P., Nichol, A., Chu, C., Chen, M., 2022. Hierarchical text-conditional image generation with clip latents. arXiv preprint arXiv:2204.06125, 1, 3.
Xu, Skoularidou, Cuesta-Infante, Veeramachaneni (b0170) 2019; 32
Nigg, Fisher, Ronsky (b0100) 1994; 2
Schwartz, Rozumalski, Trost (b0150) 2008; 41
Hicks, Uchida, Seth, Rajagopal, Delp (b0045) 2015; 137
Mundt, Koeppe, David, Witter, Bamer, Potthast, Markert (b0090) 2020; 8
Chehab, Andriacchi, Favre (b0020) 2017; 58
Rowe, Beauchamp, Wilson (b0135) 2021; 88
Seth, Hicks, Uchida, Habib, Dembia, Dunne, Ong, DeMers, Rajagopal, Millard (b0155) 2018; 14
Rajagopal (10.1016/j.jbiomech.2024.112358_b0115) 2016; 63
Rowe (10.1016/j.jbiomech.2024.112358_b0135) 2021; 88
Hof (10.1016/j.jbiomech.2024.112358_b0050) 1996; 3
10.1016/j.jbiomech.2024.112358_b0175
10.1016/j.jbiomech.2024.112358_b0055
10.1016/j.jbiomech.2024.112358_b0110
Bicer (10.1016/j.jbiomech.2024.112358_b0010) 2022; 144
Nigg (10.1016/j.jbiomech.2024.112358_b0100) 1994; 2
Kouyoumdjian (10.1016/j.jbiomech.2024.112358_b0075) 2012; 98
Moissenet (10.1016/j.jbiomech.2024.112358_b0085) 2019; 9
Halilaj (10.1016/j.jbiomech.2024.112358_b0040) 2018; 81
Kerrigan (10.1016/j.jbiomech.2024.112358_b0065) 1998; 79
10.1016/j.jbiomech.2024.112358_b0080
10.1016/j.jbiomech.2024.112358_b0060
Robinson (10.1016/j.jbiomech.2024.112358_b0125) 2021; 122
Hicks (10.1016/j.jbiomech.2024.112358_b0045) 2015; 137
Salimans (10.1016/j.jbiomech.2024.112358_b0140) 2016; 29
Weinhandl (10.1016/j.jbiomech.2024.112358_b0165) 2017
Mundt (10.1016/j.jbiomech.2024.112358_b0090) 2020; 8
Schwartz (10.1016/j.jbiomech.2024.112358_b0150) 2008; 41
Knudson (10.1016/j.jbiomech.2024.112358_b0070) 2017; 16
10.1016/j.jbiomech.2024.112358_b0120
Pataky (10.1016/j.jbiomech.2024.112358_b0105) 2013; 46
Schreiber (10.1016/j.jbiomech.2024.112358_b0145) 2019; 6
Chehab (10.1016/j.jbiomech.2024.112358_b0020) 2017; 58
Choi (10.1016/j.jbiomech.2024.112358_b0025) 2017
10.1016/j.jbiomech.2024.112358_b0095
10.1016/j.jbiomech.2024.112358_b0030
Røislien (10.1016/j.jbiomech.2024.112358_b0130) 2009; 30
Goodfellow (10.1016/j.jbiomech.2024.112358_b0035) 2014
Seth (10.1016/j.jbiomech.2024.112358_b0155) 2018; 14
Xu (10.1016/j.jbiomech.2024.112358_b0170) 2019; 32
Bruening (10.1016/j.jbiomech.2024.112358_b0015) 2015; 41
Sharifi Renani (10.1016/j.jbiomech.2024.112358_b0160) 2021; 21
Armanious (10.1016/j.jbiomech.2024.112358_b0005) 2020; 79
References_xml – volume: 98
  start-page: 17
  year: 2012
  end-page: 23
  ident: b0075
  article-title: Clinical evaluation of hip joint rotation range of motion in adults
  publication-title: Orthop. Traumatol. Surg. Res.
– volume: 6
  start-page: 111
  year: 2019
  ident: b0145
  article-title: A multimodal dataset of human gait at different walking speeds established on injury-free adult participants
  publication-title: Sci. Data
– volume: 2
  start-page: 213
  year: 1994
  end-page: 220
  ident: b0100
  article-title: Gait characteristics as a function of age and gender
  publication-title: Gait Posture
– volume: 14
  start-page: e1006223
  year: 2018
  ident: b0155
  article-title: Opensim: Simulating musculoskeletal dynamics and neuromuscular control to study human and animal movement
  publication-title: PLoS Comput. Biol.
– volume: 81
  start-page: 1
  year: 2018
  end-page: 11
  ident: b0040
  article-title: Machine learning in human movement biomechanics: Best practices, common pitfalls, and new opportunities
  publication-title: J. Biomech.
– volume: 9
  start-page: 9510
  year: 2019
  ident: b0085
  article-title: Lower limb sagittal gait kinematics can be predicted based on walking speed, gender, age and bmi
  publication-title: Sci. Rep.
– volume: 79
  year: 2020
  ident: b0005
  article-title: Medgan: Medical image translation using gans
  publication-title: Comput. Med. Imaging Graph.
– volume: 41
  start-page: 540
  year: 2015
  end-page: 545
  ident: b0015
  article-title: Sex differences in whole body gait kinematics at preferred speeds
  publication-title: Gait Posture
– volume: 3
  start-page: 222
  year: 1996
  end-page: 223
  ident: b0050
  article-title: Scaling gait data to body size
  publication-title: Gait Posture
– volume: 63
  start-page: 2068
  year: 2016
  end-page: 2079
  ident: b0115
  article-title: Full-body musculoskeletal model for muscle-driven simulation of human gait
  publication-title: IEEE Trans. Biomed. Eng.
– volume: 79
  start-page: 317
  year: 1998
  end-page: 322
  ident: b0065
  article-title: Biomechanical gait alterations independent of speed in the healthy elderly: Evidence for specific limiting impairments
  publication-title: Arch. Phys. Med. Rehabil.
– start-page: 1
  year: 2017
  end-page: 7
  ident: b0165
  article-title: Effects of gait speed of femoroacetabular joint forces
  publication-title: Appl. Bionics Biomech.
– volume: 41
  start-page: 1639
  year: 2008
  end-page: 1650
  ident: b0150
  article-title: The effect of walking speed on the gait of typically developing children
  publication-title: J. Biomech.
– reference: Esteban, C., Hyland, S.L., Rätsch, G., 2017. Real-valued (medical) time series generation with recurrent conditional gans. arXiv preprint arXiv:1706.02633.
– volume: 8
  start-page: 41
  year: 2020
  ident: b0090
  article-title: Estimation of gait mechanics based on simulated and measured imu data using an artificial neural network
  publication-title: Front. Bioeng. Biotechnol.
– reference: Ramesh, A., Dhariwal, P., Nichol, A., Chu, C., Chen, M., 2022. Hierarchical text-conditional image generation with clip latents. arXiv preprint arXiv:2204.06125, 1, 3.
– volume: 46
  start-page: 2394
  year: 2013
  end-page: 2401
  ident: b0105
  article-title: Vector field statistical analysis of kinematic and force trajectories
  publication-title: J. Biomech.
– volume: 21
  start-page: 5876
  year: 2021
  ident: b0160
  article-title: The use of synthetic imu signals in the training of deep learning models significantly improves the accuracy of joint kinematic predictions
  publication-title: Sensors
– start-page: 2672
  year: 2014
  end-page: 2680
  ident: b0035
  article-title: Generative adversarial nets
  publication-title: Int. Conf. Neural Inf. Process. Syst.
– reference: Isola, P., Zhu, J.-Y., Zhou, T., Efros, A.A., 2017. Image-to-image translation with conditional adversarial networks. In Proceedings of the IEEE conference on computer vision and pattern recognition.
– volume: 29
  start-page: 2234
  year: 2016
  end-page: 2242
  ident: b0140
  article-title: Improved techniques for training gans
  publication-title: Advances in neural information processing systems.
– volume: 144
  year: 2022
  ident: b0010
  article-title: Generative deep learning applied to biomechanics: A new augmentation technique for motion capture datasets
  publication-title: J. Biomech.
– volume: 16
  start-page: 425
  year: 2017
  end-page: 433
  ident: b0070
  article-title: Confidence crisis of results in biomechanics research
  publication-title: Sports Biomech.
– volume: 58
  start-page: 11
  year: 2017
  end-page: 20
  ident: b0020
  article-title: Speed, age, sex, and body mass index provide a rigorous basis for comparing the kinematic and kinetic profiles of the lower extremity during walking
  publication-title: J. Biomech.
– reference: Karras, T., Laine, S., Aila, T., 2019. A style-based generator architecture for generative adversarial networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 4401-4410.
– volume: 137
  year: 2015
  ident: b0045
  article-title: Is my model good enough? Best practices for verification and validation of musculoskeletal models and simulations of movement
  publication-title: J. Biomech. Eng.
– volume: 122
  year: 2021
  ident: b0125
  article-title: Sample size estimation for biomechanical waveforms: Current practice, recommendations and a comparison to discrete power analysis
  publication-title: J. Biomech.
– start-page: 286
  year: 2017
  end-page: 305
  ident: b0025
  article-title: Generating multi-label discrete patient records using generative adversarial networks
  publication-title: Machine learning for healthcare conference
– reference: Zhou, S., Gordon, M.L., Krishna, R., Narcomey, A., Fei-Fei, L., Bernstein, M.S., 2019. Hype: A benchmark for human eye perceptual evaluation of generative models. arXiv preprint arXiv:1904.01121.
– volume: 30
  start-page: 441
  year: 2009
  end-page: 445
  ident: b0130
  article-title: Simultaneous estimation of effects of gender, age and walking speed on kinematic gait data
  publication-title: Gait Posture
– reference: Mirza, M., Osindero, S., 2014. Conditional generative adversarial nets. arXiv preprint arXiv:1411.1784.
– volume: 32
  year: 2019
  ident: b0170
  article-title: Modeling tabular data using conditional gan
  publication-title: Advances in Neural Information Processing Systems
– volume: 88
  start-page: 109
  year: 2021
  end-page: 115
  ident: b0135
  article-title: Age and sex differences in normative gait patterns
  publication-title: Gait Posture
– reference: Raissi, M., Perdikaris, P., Karniadakis, G.E., 2017. Physics informed deep learning (part i): Data-driven solutions of nonlinear partial differential equations. arXiv preprint arXiv:1711.10561.
– reference: Nie, D., Trullo, R., Lian, J., Petitjean, C., Ruan, S., Wang, Q., Shen, D., 2017. Medical image synthesis with context-aware generative adversarial networks. In MICCAI 2017 Proceedings, Part III 20.
– ident: 10.1016/j.jbiomech.2024.112358_b0060
  doi: 10.1109/CVPR.2019.00453
– volume: 46
  start-page: 2394
  year: 2013
  ident: 10.1016/j.jbiomech.2024.112358_b0105
  article-title: Vector field statistical analysis of kinematic and force trajectories
  publication-title: J. Biomech.
  doi: 10.1016/j.jbiomech.2013.07.031
– volume: 122
  year: 2021
  ident: 10.1016/j.jbiomech.2024.112358_b0125
  article-title: Sample size estimation for biomechanical waveforms: Current practice, recommendations and a comparison to discrete power analysis
  publication-title: J. Biomech.
  doi: 10.1016/j.jbiomech.2021.110451
– ident: 10.1016/j.jbiomech.2024.112358_b0120
– volume: 81
  start-page: 1
  year: 2018
  ident: 10.1016/j.jbiomech.2024.112358_b0040
  article-title: Machine learning in human movement biomechanics: Best practices, common pitfalls, and new opportunities
  publication-title: J. Biomech.
  doi: 10.1016/j.jbiomech.2018.09.009
– volume: 79
  start-page: 317
  year: 1998
  ident: 10.1016/j.jbiomech.2024.112358_b0065
  article-title: Biomechanical gait alterations independent of speed in the healthy elderly: Evidence for specific limiting impairments
  publication-title: Arch. Phys. Med. Rehabil.
  doi: 10.1016/S0003-9993(98)90013-2
– volume: 8
  start-page: 41
  year: 2020
  ident: 10.1016/j.jbiomech.2024.112358_b0090
  article-title: Estimation of gait mechanics based on simulated and measured IMU data using an artificial neural network
  publication-title: Front. Bioeng. Biotechnol.
  doi: 10.3389/fbioe.2020.00041
– volume: 21
  start-page: 5876
  year: 2021
  ident: 10.1016/j.jbiomech.2024.112358_b0160
  article-title: The use of synthetic IMU signals in the training of deep learning models significantly improves the accuracy of joint kinematic predictions
  publication-title: Sensors
  doi: 10.3390/s21175876
– volume: 41
  start-page: 1639
  year: 2008
  ident: 10.1016/j.jbiomech.2024.112358_b0150
  article-title: The effect of walking speed on the gait of typically developing children
  publication-title: J. Biomech.
  doi: 10.1016/j.jbiomech.2008.03.015
– volume: 6
  start-page: 111
  year: 2019
  ident: 10.1016/j.jbiomech.2024.112358_b0145
  article-title: A multimodal dataset of human gait at different walking speeds established on injury-free adult participants
  publication-title: Sci. Data
  doi: 10.1038/s41597-019-0124-4
– ident: 10.1016/j.jbiomech.2024.112358_b0030
– volume: 30
  start-page: 441
  year: 2009
  ident: 10.1016/j.jbiomech.2024.112358_b0130
  article-title: Simultaneous estimation of effects of gender, age and walking speed on kinematic gait data
  publication-title: Gait Posture
  doi: 10.1016/j.gaitpost.2009.07.002
– volume: 2
  start-page: 213
  year: 1994
  ident: 10.1016/j.jbiomech.2024.112358_b0100
  article-title: Gait characteristics as a function of age and gender
  publication-title: Gait Posture
  doi: 10.1016/0966-6362(94)90106-6
– volume: 88
  start-page: 109
  year: 2021
  ident: 10.1016/j.jbiomech.2024.112358_b0135
  article-title: Age and sex differences in normative gait patterns
  publication-title: Gait Posture
  doi: 10.1016/j.gaitpost.2021.05.014
– volume: 14
  start-page: e1006223
  year: 2018
  ident: 10.1016/j.jbiomech.2024.112358_b0155
  article-title: OpenSim: Simulating musculoskeletal dynamics and neuromuscular control to study human and animal movement
  publication-title: PLoS Comput. Biol.
  doi: 10.1371/journal.pcbi.1006223
– start-page: 286
  year: 2017
  ident: 10.1016/j.jbiomech.2024.112358_b0025
  article-title: Generating multi-label discrete patient records using generative adversarial networks
– ident: 10.1016/j.jbiomech.2024.112358_b0080
– volume: 79
  year: 2020
  ident: 10.1016/j.jbiomech.2024.112358_b0005
  article-title: MedGAN: Medical image translation using GANs
  publication-title: Comput. Med. Imaging Graph.
  doi: 10.1016/j.compmedimag.2019.101684
– volume: 144
  year: 2022
  ident: 10.1016/j.jbiomech.2024.112358_b0010
  article-title: Generative deep learning applied to biomechanics: A new augmentation technique for motion capture datasets
  publication-title: J. Biomech.
  doi: 10.1016/j.jbiomech.2022.111301
– start-page: 2672
  year: 2014
  ident: 10.1016/j.jbiomech.2024.112358_b0035
  article-title: Generative adversarial nets
  publication-title: Int. Conf. Neural Inf. Process. Syst.
– volume: 32
  year: 2019
  ident: 10.1016/j.jbiomech.2024.112358_b0170
  article-title: Modeling tabular data using conditional GAN
  publication-title: Advances in Neural Information Processing Systems
– volume: 3
  start-page: 222
  year: 1996
  ident: 10.1016/j.jbiomech.2024.112358_b0050
  article-title: Scaling gait data to body size
  publication-title: Gait Posture
  doi: 10.1016/0966-6362(95)01057-2
– start-page: 1
  year: 2017
  ident: 10.1016/j.jbiomech.2024.112358_b0165
  article-title: Effects of gait speed on femoroacetabular joint forces
  publication-title: Appl. Bionics Biomech.
  doi: 10.1155/2017/6432969
– volume: 58
  start-page: 11
  year: 2017
  ident: 10.1016/j.jbiomech.2024.112358_b0020
  article-title: Speed, age, sex, and body mass index provide a rigorous basis for comparing the kinematic and kinetic profiles of the lower extremity during walking
  publication-title: J. Biomech.
  doi: 10.1016/j.jbiomech.2017.04.014
– volume: 63
  start-page: 2068
  year: 2016
  ident: 10.1016/j.jbiomech.2024.112358_b0115
  article-title: Full-body musculoskeletal model for muscle-driven simulation of human gait
  publication-title: IEEE Trans. Biomed. Eng.
  doi: 10.1109/TBME.2016.2586891
– ident: 10.1016/j.jbiomech.2024.112358_b0095
  doi: 10.1007/978-3-319-66179-7_48
– ident: 10.1016/j.jbiomech.2024.112358_b0110
– volume: 29
  start-page: 2234
  year: 2016
  ident: 10.1016/j.jbiomech.2024.112358_b0140
  article-title: Improved techniques for training GANs
  publication-title: Advances in Neural Information Processing Systems
– volume: 9
  start-page: 9510
  year: 2019
  ident: 10.1016/j.jbiomech.2024.112358_b0085
  article-title: Lower limb sagittal gait kinematics can be predicted based on walking speed, gender, age and BMI
  publication-title: Sci. Rep.
  doi: 10.1038/s41598-019-45397-4
– ident: 10.1016/j.jbiomech.2024.112358_b0055
  doi: 10.1109/CVPR.2017.632
– volume: 41
  start-page: 540
  year: 2015
  ident: 10.1016/j.jbiomech.2024.112358_b0015
  article-title: Sex differences in whole body gait kinematics at preferred speeds
  publication-title: Gait Posture
  doi: 10.1016/j.gaitpost.2014.12.011
– volume: 98
  start-page: 17
  year: 2012
  ident: 10.1016/j.jbiomech.2024.112358_b0075
  article-title: Clinical evaluation of hip joint rotation range of motion in adults
  publication-title: Orthop. Traumatol. Surg. Res.
  doi: 10.1016/j.otsr.2011.08.015
– volume: 137
  year: 2015
  ident: 10.1016/j.jbiomech.2024.112358_b0045
  article-title: Is my model good enough? Best practices for verification and validation of musculoskeletal models and simulations of movement
  publication-title: J. Biomech. Eng.
  doi: 10.1115/1.4029304
– volume: 16
  start-page: 425
  year: 2017
  ident: 10.1016/j.jbiomech.2024.112358_b0070
  article-title: Confidence crisis of results in biomechanics research
  publication-title: Sports Biomech.
  doi: 10.1080/14763141.2016.1246603
– ident: 10.1016/j.jbiomech.2024.112358_b0175
StartPage 112358
SubjectTerms Adult
Biomechanical Phenomena
Conditional Generative Adversarial Networks
Data compression
Datasets
Deep Learning
Design
Female
Gait
Gait - physiology
Generative adversarial networks
Humans
Kinematics
Male
Middle Aged
Motion Capture
Movement
Neural networks
Neural Networks, Computer
Synthetic data
Synthetic Mocap Dataset
Walking
Walking - physiology
Title Generative adversarial networks to create synthetic motion capture datasets including subject and gait characteristics
URI https://www.clinicalkey.com/#!/content/1-s2.0-S0021929024004366
https://dx.doi.org/10.1016/j.jbiomech.2024.112358
https://www.ncbi.nlm.nih.gov/pubmed/39509807
https://www.proquest.com/docview/3141521460
https://www.proquest.com/docview/3128318171
Volume 177