HRDepthNet: Depth Image-Based Marker-Less Tracking of Body Joints
Published in | Sensors (Basel, Switzerland) Vol. 21; no. 4; p. 1356 |
Main Authors | Büker, Linda Christin; Zuber, Finnja; Hein, Andreas; Fudickar, Sebastian |
Format | Journal Article |
Language | English |
Published | Switzerland: MDPI / MDPI AG, 14.02.2021 |
Subjects | depth camera; machine learning; marker-less tracking; timed “up & go” test (TUG) |
ISSN | 1424-8220 |
DOI | 10.3390/s21041356 |
Abstract | With approaches for the detection of joint positions in color images such as HRNet and OpenPose being available, consideration of corresponding approaches for depth images is limited, even though depth images have several advantages over color images, such as robustness to light variation and color and texture invariance. Correspondingly, we introduce High-Resolution Depth Net (HRDepthNet), a machine-learning-driven approach to detect human joints (body, head, and upper and lower extremities) purely in depth images. HRDepthNet retrains the original HRNet for depth images. To this end, a dataset was created holding depth (and RGB) images recorded with subjects conducting the timed up and go test, an established geriatric assessment. The images were manually annotated on the RGB images. Training and evaluation were conducted with this dataset. For accuracy evaluation, detection of body joints was evaluated via COCO’s evaluation metrics, indicating that the resulting depth image-based model achieved better results than HRNet trained and applied on the corresponding RGB images. An additional evaluation of the position errors showed a median deviation of 1.619 cm (x-axis), 2.342 cm (y-axis) and 2.4 cm (z-axis). |
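The abstract reports accuracy via COCO’s keypoint evaluation metrics. Those metrics are built on the Object Keypoint Similarity (OKS) score; as a hedged illustration only (the paper’s actual evaluation uses COCO’s official tooling, and all arrays and values below are invented for the example), OKS for one predicted pose against one ground-truth pose can be sketched in NumPy:

```python
import numpy as np

def oks(pred, gt, visibility, area, k):
    """COCO-style Object Keypoint Similarity.

    pred, gt:    (N, 2) keypoint coordinates in pixels
    visibility:  (N,) ground-truth visibility flags (0 = unlabeled)
    area:        object segment area in pixels^2 (the scale factor s^2)
    k:           (N,) per-keypoint falloff constants
    """
    d2 = np.sum((pred - gt) ** 2, axis=1)      # squared keypoint distances
    e = d2 / (2.0 * area * k ** 2)             # scale-normalized error
    vis = visibility > 0
    # Average the per-keypoint similarities over labeled keypoints only.
    return float(np.sum(np.exp(-e)[vis]) / np.sum(vis))

# A perfect prediction yields an OKS of 1.0 (kappas here are illustrative).
gt = np.array([[10.0, 20.0], [30.0, 40.0]])
k = np.array([0.079, 0.072])
print(oks(gt.copy(), gt, np.array([2, 2]), area=5000.0, k=k))  # 1.0
```

COCO’s AP/AR keypoint metrics are then obtained by thresholding OKS at a range of values (0.50 to 0.95) and averaging, analogous to IoU thresholds in object detection.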
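The per-axis median deviations reported in the abstract (1.619 cm on x, 2.342 cm on y, 2.4 cm on z) are a simple statistic over paired predicted and ground-truth 3D joint positions. A minimal sketch of that computation, with purely hypothetical data standing in for the paper’s dataset:

```python
import numpy as np

# Hypothetical paired 3D joint positions in cm; the real evaluation uses the
# HRDepthNet dataset, not these illustrative values.
pred = np.array([[10.0, 20.0, 100.0],
                 [11.0, 22.0, 101.0],
                 [ 9.0, 19.0,  98.0]])
gt   = np.array([[ 9.0, 21.0, 102.0],
                 [10.0, 20.0, 100.0],
                 [10.0, 21.0, 100.0]])

# Median absolute deviation per axis (x, y, z).
median_dev = np.median(np.abs(pred - gt), axis=0)
print(median_dev)  # [1. 2. 2.]
```

The median (rather than the mean) makes the reported error robust to occasional gross mis-detections of single joints.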
Author | Büker, Linda Christin; Zuber, Finnja; Hein, Andreas; Fudickar, Sebastian |
AuthorAffiliation | Assistance Systems and Medical Device Technology, Department of Health Services Research, Carl von Ossietzky University Oldenburg, 26129 Oldenburg, Germany; linda.christin.bueker@uni-oldenburg.de (L.C.B.); finnja.zuber@gmx.de (F.Z.); andreas.hein@uni-oldenburg.de (A.H.) |
CitedBy_id | 10.3390/s21123944; 10.1016/j.aei.2022.101596; 10.3390/s22145282; 10.3390/ijerph19031179; 10.3390/s24196329 |
Cites_doi | 10.1109/CVPR.2019.00584 10.1109/EMBC.2012.6346149 10.1109/ICRA.2011.5980567 10.1155/2015/186780 10.1109/CVPR.2010.5540141 10.1109/CVPR.2011.5995316 10.1109/SIBGRA.2003.1240987 10.1109/MRA.2018.2852795 10.1109/TPAMI.2019.2929257 10.1111/jgs.16135 10.1590/bjpt-rbf.2014.0067 10.1186/1687-6180-2012-36 10.1155/2016/5036857 10.1109/ICCV.2011.6126310 10.3389/fneur.2019.00283 10.1145/2366145.2366207 10.1007/978-3-030-61609-0_21 10.1017/CBO9780511811685 10.3390/s17071591 10.3390/s19061370 10.3390/s18010014 10.1109/SACI.2011.5873039 10.3390/s18103310 10.1007/978-3-030-59830-3 10.3390/s20102824 10.1016/j.medengphy.2017.03.007 10.1109/ICHI48887.2020.9374363 |
ContentType | Journal Article |
Copyright | 2021 by the authors. 2021 |
Copyright_xml | – notice: 2021 by the authors. 2021 |
DOI | 10.3390/s21041356 |
DatabaseName | CrossRef Medline MEDLINE MEDLINE (Ovid) MEDLINE MEDLINE PubMed MEDLINE - Academic PubMed Central (Full Participant titles) DOAJ Directory of Open Access Journals |
Discipline | Engineering |
EISSN | 1424-8220 |
ExternalDocumentID | oai_doaj_org_article_82f854dca98941f9aa757daacb7d553a PMC7918542 33672984 10_3390_s21041356 |
Genre | Journal Article |
ISSN | 1424-8220 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 4 |
Keywords | marker-less tracking; timed “up & go” test (TUG); 5 × SST; depth camera; machine learning algorithm |
Language | English |
License | https://creativecommons.org/licenses/by/4.0 Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
ORCID | 0000-0001-8846-2282 0000-0002-3553-5131 |
OpenAccessLink | http://journals.scholarsportal.info/openUrl.xqy?doi=10.3390/s21041356 |
PMID | 33672984 |
PQID | 2498502706 |
PQPubID | 23479 |
PublicationCentury | 2000 |
PublicationDate | 20210214 |
PublicationDecade | 2020 |
PublicationPlace | Switzerland |
PublicationTitle | Sensors (Basel, Switzerland) |
PublicationTitleAlternate | Sensors (Basel) |
PublicationYear | 2021 |
Publisher | MDPI MDPI AG |
SourceID | doaj pubmedcentral proquest pubmed crossref |
SourceType | Open Website Open Access Repository Aggregation Database Index Database Enrichment Source |
StartPage | 1356 |
SubjectTerms | 5 × SST; Aged; Color; depth camera; Humans; Joints; Machine Learning; marker-less tracking; Postural Balance; Time and Motion Studies; timed “up & go” test (TUG) |
Title | HRDepthNet: Depth Image-Based Marker-Less Tracking of Body Joints |
URI | https://www.ncbi.nlm.nih.gov/pubmed/33672984 https://www.proquest.com/docview/2498502706 https://pubmed.ncbi.nlm.nih.gov/PMC7918542 https://doaj.org/article/82f854dca98941f9aa757daacb7d553a |
Volume | 21 |