A deep learning model for classifying human facial expressions from infrared thermal images
Published in | Scientific Reports, Vol. 11, No. 1, Article 20696 (17 pages) |
Main Authors | Bhattacharyya, Ankan; Chatterjee, Somnath; Sen, Shibaprasad; Sinitca, Aleksandr; Kaplun, Dmitrii; Sarkar, Ram |
Format | Journal Article |
Language | English |
Published | London: Nature Publishing Group UK, 19.10.2021 |
Abstract | The analysis of human facial expressions from thermal images captured by Infrared Thermal Imaging (IRTI) cameras has recently gained importance compared to images captured by standard cameras using light in the visible spectrum. This is because infrared cameras work well in low-light conditions, and the infrared spectrum captures the thermal distribution of the face, which is very useful for applications such as robot interaction systems, quantifying cognitive responses from facial expressions, and disease control. In this paper, a deep learning model called IRFacExNet (InfraRed Facial Expression Network) has been proposed for facial expression recognition (FER) from infrared images. It uses two building blocks, namely a Residual unit and a Transformation unit, which extract dominant, expression-specific features from the input images. The extracted features help to detect the emotion of the subjects under consideration accurately. The Snapshot ensemble technique is adopted with a cosine annealing learning rate scheduler to improve the overall performance. The performance of the proposed model has been evaluated on a publicly available dataset, namely the IRDatabase developed by RWTH Aachen University. The facial expressions present in the dataset are Fear, Anger, Contempt, Disgust, Happy, Neutral, Sad, and Surprise. The proposed model produces 88.43% recognition accuracy, better than some state-of-the-art methods considered here for comparison. Our model provides a robust framework for the accurate detection of expressions in the absence of visible light. |
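
The record describes IRFacExNet only at the level of its two building blocks, a Residual unit and a Transformation unit feeding an eight-class expression classifier. The PyTorch sketch below illustrates one common way such blocks are composed; the channel widths, kernel sizes, strides, network depth, and the single-channel thermal input are illustrative assumptions, not the published IRFacExNet configuration.

```python
# Illustrative sketch only: the record names two building blocks (a Residual
# unit and a Transformation unit) but gives no layer-level detail, so every
# width, kernel size, stride and the single-channel thermal input below are
# assumptions, not the published IRFacExNet configuration.
import torch
import torch.nn as nn


class ResidualUnit(nn.Module):
    """Two 3x3 convolutions with an identity skip connection (channels unchanged)."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(x + self.body(x))  # residual (skip) connection


class TransformationUnit(nn.Module):
    """Strided convolution that changes channel count and spatial size between stages."""

    def __init__(self, in_ch: int, out_ch: int, stride: int = 2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.body(x)


class IRFacExNetSketch(nn.Module):
    """Alternates the two units and ends in an 8-way expression classifier."""

    def __init__(self, num_classes: int = 8):  # 8 expressions in the IRDatabase
        super().__init__()
        self.features = nn.Sequential(
            TransformationUnit(1, 32),   # thermal input assumed to be 1 channel
            ResidualUnit(32),
            TransformationUnit(32, 64),
            ResidualUnit(64),
            TransformationUnit(64, 128),
            ResidualUnit(128),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x):
        return self.head(self.features(x))  # raw class scores (logits)
```

In a layout like this, the Transformation units handle the changes in resolution and width between stages while the Residual units refine features at a fixed size, which matches the abstract's description of the two blocks jointly extracting expression-specific features.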
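The abstract also mentions training with the Snapshot ensemble technique under a cosine annealing learning rate scheduler. A minimal sketch of that idea, assuming a cyclic schedule in the style of Huang et al.'s snapshot ensembles, is given below; the optimizer, cycle count, epochs per cycle, and peak learning rate are placeholders rather than the values used in the paper.

```python
# Minimal sketch of snapshot ensembling with a cyclic cosine-annealed learning
# rate (in the style of Huang et al.'s "Snapshot Ensembles"). The optimizer,
# cycle count, epochs per cycle and peak learning rate are placeholders, not
# the settings reported in the paper.
import copy
import math

import torch


def cosine_lr(step: int, steps_per_cycle: int, lr_max: float) -> float:
    """Learning rate that decays from lr_max towards 0 and restarts each cycle."""
    t = step % steps_per_cycle
    return 0.5 * lr_max * (1.0 + math.cos(math.pi * t / steps_per_cycle))


def train_snapshot_ensemble(model, loader, loss_fn, n_cycles=5,
                            epochs_per_cycle=40, lr_max=0.1, device="cpu"):
    model = model.to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=lr_max, momentum=0.9)
    steps_per_cycle = epochs_per_cycle * len(loader)
    snapshots, step = [], 0
    for _cycle in range(n_cycles):
        for _epoch in range(epochs_per_cycle):
            for x, y in loader:
                for group in optimizer.param_groups:  # apply the scheduled LR
                    group["lr"] = cosine_lr(step, steps_per_cycle, lr_max)
                optimizer.zero_grad()
                loss = loss_fn(model(x.to(device)), y.to(device))
                loss.backward()
                optimizer.step()
                step += 1
        # End of a cycle: the LR has annealed towards zero, so the model sits in
        # a local minimum; keep a frozen copy as one member of the ensemble.
        snapshots.append(copy.deepcopy(model).eval())
    return snapshots


@torch.no_grad()
def ensemble_predict(snapshots, x):
    """Average the softmax outputs of all snapshots and return the argmax class."""
    probs = torch.stack([torch.softmax(m(x), dim=1) for m in snapshots]).mean(0)
    return probs.argmax(dim=1)
```

Because each snapshot is taken at the end of a cosine cycle, the ensemble members all come from a single training run, which is what makes the technique attractive for improving accuracy without additional training cost.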
ArticleNumber | 20696 |
Author | Sen, Shibaprasad; Sinitca, Aleksandr; Bhattacharyya, Ankan; Sarkar, Ram; Kaplun, Dmitrii; Chatterjee, Somnath |
Author_xml | 1. Ankan Bhattacharyya (University of Kentucky); 2. Somnath Chatterjee (Computer Science and Engineering Department, Future Institute of Engineering and Management); 3. Shibaprasad Sen (Computer Science and Technology Department, University of Engineering and Management); 4. Aleksandr Sinitca (Department of Automation and Control Processes, Saint Petersburg Electrotechnical University “LETI”); 5. Dmitrii Kaplun (Department of Automation and Control Processes, Saint Petersburg Electrotechnical University “LETI”; email: dikaplun@etu.ru); 6. Ram Sarkar (Department of Computer Science and Engineering, Jadavpur University) |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/34667253 (view this record in MEDLINE/PubMed) |
ContentType | Journal Article |
Copyright | The Author(s) 2021 2021. The Author(s). The Author(s) 2021. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
DOI | 10.1038/s41598-021-99998-z |
Discipline | Biology |
EISSN | 2045-2322 |
EndPage | 17 |
Genre | Research Support, Non-U.S. Gov't; Journal Article |
GrantInformation_xml | Russian Foundation for Basic Research (funder ID: http://dx.doi.org/10.13039/501100002261), grant 19-57-06007 |
ISSN | 2045-2322 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 1 |
Language | English |
License | 2021. The Author(s). Open AccessThis article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. |
PMID | 34667253 |
PageCount | 17 |
PublicationDate | 2021-10-19 |
PublicationPlace | London |
PublicationTitle | Scientific reports |
PublicationTitleAbbrev | Sci Rep |
PublicationYear | 2021 |
Publisher | Nature Publishing Group UK; Nature Publishing Group; Nature Portfolio |
StartPage | 20696 |
SubjectTerms | 631/1647/245; 631/61/185; Cameras; Cognition - physiology; Cognitive ability; Deep Learning; Disease control; Emotions - physiology; Facial Expression; Facial Recognition - physiology; Female; Humanities and Social Sciences; Humans; multidisciplinary; Pattern recognition; Science; Science (multidisciplinary); Spectrophotometry, Infrared - methods |
Title | A deep learning model for classifying human facial expressions from infrared thermal images |
URI | https://link.springer.com/article/10.1038/s41598-021-99998-z https://www.ncbi.nlm.nih.gov/pubmed/34667253 https://www.proquest.com/docview/2583230865 https://www.proquest.com/docview/2584013361 https://pubmed.ncbi.nlm.nih.gov/PMC8526608 https://doaj.org/article/e172273ed766411397b549d2845bfea4 |
Volume | 11 |