Efficient and High-Quality Monocular Depth Estimation via Gated Multi-Scale Network

Bibliographic Details
Published in IEEE Access, Vol. 8, pp. 7709–7718
Main Authors Lin, Lixiong, Huang, Guohui, Chen, Yanjie, Zhang, Liwei, He, Bingwei
Format Journal Article
Language English
Published Piscataway IEEE 2020
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Subjects
Online Access Get full text

Abstract The key issue in monocular depth estimation is how to better reconstruct the depth image and improve the quality of the depth map. At present, most deep-learning-based monocular depth estimation methods process images at low resolution, which leads to loss of detail and blurred boundaries. Moreover, deep networks with a large number of parameters are computationally expensive, which makes it difficult to use high-resolution (HR) images for depth estimation. In this work, model accuracy and runtime are both treated as important factors. To improve depth map quality and reduce the running time of the network, we introduce super-resolution techniques as the up-sampling method, generating high-quality depth images at a faster rate for the depth estimation network. A novel approach is proposed for collecting high-level features captured under different receptive fields. The gated multi-scale decoder effectively filters this information through its gated modules. By combining the gated module with the super-resolution of depth images, our method reduces memory consumption while improving reconstruction quality. Experimental results on the challenging NYU Depth v2 dataset demonstrate that both contributions provide significant performance gains over the state of the art in self-supervised depth estimation.
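To make the decoder idea in the abstract concrete, the following PyTorch sketch shows one plausible way to implement a gated multi-scale module (features from several receptive fields fused through a learned gate) together with a super-resolution style sub-pixel up-sampling layer. The class names GatedMultiScaleBlock and SubPixelUp, the dilation rates, and the softmax gate are illustrative assumptions, not the exact architecture reported in the paper.

import torch
import torch.nn as nn


class GatedMultiScaleBlock(nn.Module):
    # Hypothetical sketch: fuse features captured under different receptive
    # fields with a learned, per-pixel gate (design details are assumptions).
    def __init__(self, channels: int, dilations=(1, 2, 4)):
        super().__init__()
        # One 3x3 branch per receptive field, realised with dilated convolutions.
        self.branches = nn.ModuleList(
            [nn.Conv2d(channels, channels, kernel_size=3, padding=d, dilation=d)
             for d in dilations]
        )
        # The gate predicts one weight map per branch and normalises them,
        # so the decoder can filter which scale contributes at each pixel.
        self.gate = nn.Sequential(
            nn.Conv2d(channels * len(dilations), len(dilations), kernel_size=1),
            nn.Softmax(dim=1),
        )

    def forward(self, x):
        feats = [branch(x) for branch in self.branches]     # each (N, C, H, W)
        gates = self.gate(torch.cat(feats, dim=1))          # (N, num_branches, H, W)
        # Broadcast each single-channel gate map over the channel dimension.
        return sum(gates[:, i:i + 1] * f for i, f in enumerate(feats))


class SubPixelUp(nn.Module):
    # Super-resolution style up-sampling (sub-pixel convolution / PixelShuffle),
    # used here as an assumed stand-in for the paper's SR-based up-sampling.
    def __init__(self, in_channels: int, out_channels: int, scale: int = 2):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels * scale ** 2,
                              kernel_size=3, padding=1)
        self.shuffle = nn.PixelShuffle(scale)

    def forward(self, x):
        return self.shuffle(self.conv(x))


if __name__ == "__main__":
    # Example: fuse multi-scale encoder features, then up-sample by 2x.
    feats = torch.randn(1, 64, 60, 80)          # e.g. NYU-sized features at 1/8 resolution
    fused = GatedMultiScaleBlock(64)(feats)     # (1, 64, 60, 80)
    up = SubPixelUp(64, 32, scale=2)(fused)     # (1, 32, 120, 160)
    print(fused.shape, up.shape)

In this sketch the gate costs only one 1x1 convolution per block, which is consistent with the abstract's emphasis on keeping memory consumption and runtime low while improving reconstruction quality.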
Author Lin, Lixiong
Zhang, Liwei
He, Bingwei
Huang, Guohui
Chen, Yanjie
Author_xml – sequence: 1
  givenname: Lixiong
  orcidid: 0000-0002-9829-5358
  surname: Lin
  fullname: Lin, Lixiong
  email: linlixiong@126.com
  organization: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China
– sequence: 2
  givenname: Guohui
  surname: Huang
  fullname: Huang, Guohui
  organization: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China
– sequence: 3
  givenname: Yanjie
  surname: Chen
  fullname: Chen, Yanjie
  organization: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China
– sequence: 4
  givenname: Liwei
  orcidid: 0000-0003-3083-9002
  surname: Zhang
  fullname: Zhang, Liwei
  organization: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China
– sequence: 5
  givenname: Bingwei
  orcidid: 0000-0002-4386-8542
  surname: He
  fullname: He, Bingwei
  organization: School of Mechanical Engineering and Automation, Fuzhou University, Fuzhou, China
CODEN IAECCG
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020
DOI 10.1109/ACCESS.2020.2964733
Discipline Engineering
EISSN 2169-3536
EndPage 7718
Genre orig-research
GrantInformation_xml – fundername: Natural Science Foundation of Fujian Province
  grantid: 2019J05024
  funderid: 10.13039/501100003392
– fundername: Department of Education, Fujian Province
  grantid: JAT170091
  funderid: 10.13039/501100003410
ISSN 2169-3536
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Language English
License https://creativecommons.org/licenses/by/4.0/legalcode
ORCID 0000-0002-4386-8542
0000-0003-3083-9002
0000-0002-9829-5358
OpenAccessLink https://doaj.org/article/51041fc42fd44d8e855a0c8b17260bc6
PageCount 10
PublicationDate 2020-01-01
PublicationPlace Piscataway
PublicationPlace_xml – name: Piscataway
PublicationTitle IEEE access
PublicationTitleAbbrev Access
PublicationYear 2020
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 7709
SubjectTerms Blurring
Convolution
Decoding
Deep learning
Depth estimation
Estimation
Feature extraction
gated multi-scale network
Image manipulation
Image quality
Image reconstruction
Image resolution
Logic gates
Model accuracy
Modules
monocular vision
super resolution