Improved YOLOv3 model with feature map cropping for multi-scale road object detection
Published in | Measurement science & technology Vol. 34; no. 4; p. 45406 |
Main Authors | Shen, Lingzhi; Tao, Hongfeng; Ni, Yuanzhi; Wang, Yue; Stojanovic, Vladimir |
Format | Journal Article |
Language | English |
Published | 01.04.2023 |
Abstract | Road object detection is an essential and imperative step for driving intelligent vehicles. Generally, road objects, such as vehicles and pedestrians, present the characteristic of multi-scale and uncertain distribution, which puts a high demand on the detection algorithm. Therefore, this paper proposes a YOLOv3 (You Only Look Once v3)-based method aimed at enhancing the capability of cross-scale detection and focusing on the valuable area. The proposed method fills an urgent need for multi-scale detection, and its individual components will be useful in road object detection. The K-means-GIoU algorithm is designed to generate a priori boxes whose shapes are close to real boxes. This greatly reduces the complexity of training, paving the way for fast convergence. Then, a detection branch is added to detect small targets, and a feature map cropping module is introduced into the newly added detection branch to remove the areas with a high probability of background targets and easy-to-detect targets; the cropped areas of the feature map are filled with a value of 0. Further, a channel attention module and a spatial attention module are added to strengthen the network's attention to major regions. The experiment results on the KITTI dataset show that the proposed method maintains a fast detection speed and increases the mAP (mean average precision) value by as much as 2.86% compared with YOLOv3-ultralytics, and especially improves the detection performance for small-scale objects. |
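The K-means-GIoU anchor generation described in the abstract can be sketched as below. This is a minimal illustration, not the authors' code: the function names (`giou_wh`, `kmeans_giou`) and the median cluster-update rule are assumptions; the idea is to cluster ground-truth (width, height) shapes with 1 − GIoU as the distance, so the resulting a priori boxes lie close to real box shapes.

```python
import numpy as np

def giou_wh(box, clusters):
    """GIoU between one (w, h) box and k cluster boxes, all centred at the origin.
    With a shared centre, the intersection uses element-wise minima and the
    smallest enclosing box uses element-wise maxima."""
    inter = np.minimum(box[0], clusters[:, 0]) * np.minimum(box[1], clusters[:, 1])
    union = box[0] * box[1] + clusters[:, 0] * clusters[:, 1] - inter
    enclose = np.maximum(box[0], clusters[:, 0]) * np.maximum(box[1], clusters[:, 1])
    return inter / union - (enclose - union) / enclose

def kmeans_giou(boxes, k=9, iters=100, seed=0):
    """Cluster ground-truth (w, h) shapes with 1 - GIoU as the distance,
    returning k a priori boxes sorted by area (small to large)."""
    rng = np.random.default_rng(seed)
    clusters = boxes[rng.choice(len(boxes), size=k, replace=False)].astype(float)
    assign = np.full(len(boxes), -1)
    for _ in range(iters):
        dists = np.stack([1.0 - giou_wh(b, clusters) for b in boxes])  # (N, k)
        new_assign = dists.argmin(axis=1)
        if np.array_equal(new_assign, assign):
            break  # assignments stable: converged
        assign = new_assign
        for j in range(k):
            if np.any(assign == j):
                clusters[j] = np.median(boxes[assign == j], axis=0)
    return clusters[np.argsort(clusters[:, 0] * clusters[:, 1])]
```

Standard YOLOv3 uses nine anchors, three per detection scale; with the extra small-target branch the paper adds, one would presumably generate three more.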
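The feature-map cropping step the abstract describes (removing regions judged to hold background or easy-to-detect targets and filling them with 0) can be illustrated with a small sketch. The interface below, including the `drop_boxes` region list, is an assumption for illustration, not the paper's implementation:

```python
import numpy as np

def crop_feature_map(fmap, drop_boxes):
    """Fill chosen regions of a (C, H, W) feature map with 0, as in the
    cropping module described in the abstract.
    drop_boxes: iterable of (y0, y1, x0, x1) in feature-map coordinates."""
    out = fmap.copy()
    for y0, y1, x0, x1 in drop_boxes:
        out[:, y0:y1, x0:x1] = 0.0
    return out

# Example: zero out the top-left quadrant of an 8x8 map with 2 channels.
fmap = np.ones((2, 8, 8))
cropped = crop_feature_map(fmap, [(0, 4, 0, 4)])
```

Zero-filling (rather than physically shrinking the tensor) keeps the feature map's spatial size fixed, so the downstream detection head needs no shape changes.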
Author | Shen, Lingzhi; Tao, Hongfeng (ORCID 0000-0001-5279-2458); Ni, Yuanzhi; Wang, Yue; Stojanovic, Vladimir (ORCID 0000-0002-6005-2086) |
ContentType | Journal Article |
DOI | 10.1088/1361-6501/acb075 |
DatabaseName | CrossRef |
Discipline | Sciences (General) Physics |
EISSN | 1361-6501 |
ISSN | 0957-0233 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 4 |
ORCID | 0000-0002-6005-2086 0000-0001-5279-2458 |
PublicationDate | 2023-04-01 |
PublicationTitle | Measurement science & technology |
PublicationYear | 2023 |
StartPage | 45406 |
Title | Improved YOLOv3 model with feature map cropping for multi-scale road object detection |
Volume | 34 |
linkProvider | IOP Publishing |