Tinier-YOLO: A Real-Time Object Detection Method for Constrained Environments

Bibliographic Details
Published in: IEEE Access, Vol. 8, pp. 1935-1944
Main Authors: Fang, Wei; Wang, Lin; Ren, Peiming
Format: Journal Article
Language: English
Published: Piscataway: IEEE, 2020
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Subjects
Online Access: Get full text

Abstract Deep neural networks (DNNs) have shown prominent performance in the field of object detection. However, DNNs usually run on powerful devices with high computational ability and sufficient memory, which greatly limits their deployment in constrained environments such as embedded devices. YOLO is one of the state-of-the-art DNN-based object detection approaches with good performance in both speed and accuracy, and Tiny-YOLO-V3 is its latest variant with a small model that can run on embedded devices. In this paper, Tinier-YOLO, which originates from Tiny-YOLO-V3, is proposed to further shrink the model size while achieving improved detection accuracy and real-time performance. In Tinier-YOLO, the fire module from SqueezeNet is adopted, with the number of fire modules and their positions in the model investigated in order to reduce the number of model parameters and thereby the model size. To further improve Tinier-YOLO in terms of detection accuracy and real-time performance, the connectivity between fire modules differs from SqueezeNet: dense connections are introduced and carefully designed to strengthen feature propagation and ensure maximum information flow in the network. Object detection performance is further enhanced in Tinier-YOLO by a passthrough layer that merges feature maps from earlier layers to obtain fine-grained features, countering the negative effect of reducing the model size. The resulting Tinier-YOLO has a model size of 8.9 MB (almost 4× smaller than Tiny-YOLO-V3) while achieving 25 FPS real-time performance on the Jetson TX1 and an mAP of 65.7% on PASCAL VOC and 34.0% on COCO. Tinier-YOLO also achieves comparable mAP and faster runtime with a smaller model size and lower BFLOP/s than other lightweight models such as SqueezeNet-SSD and MobileNet-SSD.
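To make the architectural ideas in the abstract concrete, the following is a minimal PyTorch sketch (not the authors' released code) of the two building blocks it describes: a SqueezeNet-style fire module and a densely connected stack of fire modules. The class names, channel widths, and module count are illustrative assumptions, not values taken from the paper.

import torch
import torch.nn as nn

class Fire(nn.Module):
    """SqueezeNet fire module: a 1x1 squeeze convolution followed by
    parallel 1x1 and 3x3 expand convolutions, concatenated on channels."""

    def __init__(self, in_ch, squeeze_ch, expand_ch):
        super().__init__()
        self.squeeze = nn.Conv2d(in_ch, squeeze_ch, kernel_size=1)
        self.expand1x1 = nn.Conv2d(squeeze_ch, expand_ch, kernel_size=1)
        self.expand3x3 = nn.Conv2d(squeeze_ch, expand_ch, kernel_size=3, padding=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        s = self.act(self.squeeze(x))
        # Concatenate the two expand branches along the channel axis.
        return torch.cat([self.act(self.expand1x1(s)),
                          self.act(self.expand3x3(s))], dim=1)

class DenseFireBlock(nn.Module):
    """Dense connectivity in the spirit the abstract describes: each fire
    module receives the concatenation of the block input and all earlier
    fire outputs, strengthening feature propagation and information flow."""

    def __init__(self, in_ch, num_modules=3, squeeze_ch=16, expand_ch=64):
        super().__init__()
        self.fires = nn.ModuleList()
        ch = in_ch
        for _ in range(num_modules):
            self.fires.append(Fire(ch, squeeze_ch, expand_ch))
            ch += 2 * expand_ch  # each fire module adds 2*expand_ch channels

    def forward(self, x):
        feats = [x]
        for fire in self.fires:
            feats.append(fire(torch.cat(feats, dim=1)))
        return torch.cat(feats, dim=1)

if __name__ == "__main__":
    block = DenseFireBlock(in_ch=128)
    y = block(torch.randn(1, 128, 26, 26))
    print(y.shape)  # torch.Size([1, 512, 26, 26]): 128 input + 3 fires x 128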
Author Ren, Peiming
Wang, Lin
Fang, Wei
Author_xml – sequence: 1
  givenname: Wei
  orcidid: 0000-0001-8052-0994
  surname: Fang
  fullname: Fang, Wei
  email: fangwei@jiangnan.edu.cn
  organization: School of IoT Engineering, Jiangnan University, Wuxi, China
– sequence: 2
  givenname: Lin
  orcidid: 0000-0002-7779-5177
  surname: Wang
  fullname: Wang, Lin
  organization: School of IoT Engineering, Jiangnan University, Wuxi, China
– sequence: 3
  givenname: Peiming
  orcidid: 0000-0002-1186-2170
  surname: Ren
  fullname: Ren, Peiming
  organization: School of IoT Engineering, Jiangnan University, Wuxi, China
CODEN IAECCG
CitedBy_id crossref_primary_10_1109_JSEN_2024_3356356
crossref_primary_10_3390_s21082618
crossref_primary_10_1109_ACCESS_2022_3223374
crossref_primary_10_1007_s11042_021_11480_0
crossref_primary_10_1007_s00330_023_10184_3
crossref_primary_10_1109_TMM_2023_3274369
crossref_primary_10_1109_JSEN_2022_3154479
crossref_primary_10_3390_electronics11172748
crossref_primary_10_1016_j_eswa_2023_121036
crossref_primary_10_1109_ACCESS_2024_3349978
crossref_primary_10_3390_s21238069
crossref_primary_10_1007_s11227_020_03578_3
crossref_primary_10_1007_s12652_022_03897_8
crossref_primary_10_3390_electronics10151780
crossref_primary_10_3390_ani13152428
crossref_primary_10_3390_app112311229
crossref_primary_10_3390_electronics13010148
crossref_primary_10_1109_TII_2021_3114296
crossref_primary_10_3390_act13030081
crossref_primary_10_3390_wevj15040158
crossref_primary_10_1109_ACCESS_2020_3047071
crossref_primary_10_1177_03611981231170591
crossref_primary_10_3390_s23167112
crossref_primary_10_1109_TVLSI_2023_3305937
crossref_primary_10_1007_s11119_024_10150_z
crossref_primary_10_1109_TII_2021_3139348
crossref_primary_10_1002_sys_21606
crossref_primary_10_1007_s41870_022_00895_z
crossref_primary_10_1016_j_ijmultiphaseflow_2021_103593
crossref_primary_10_1109_JIOT_2022_3188518
crossref_primary_10_1109_ACCESS_2024_3386826
crossref_primary_10_3390_s20216205
crossref_primary_10_1051_itmconf_20224403053
crossref_primary_10_1038_s41598_023_46693_w
crossref_primary_10_1007_s11042_023_16852_2
crossref_primary_10_1016_j_neucom_2023_02_006
crossref_primary_10_2139_ssrn_4020403
crossref_primary_10_1016_j_bdr_2020_100182
crossref_primary_10_1007_s10489_023_04600_w
crossref_primary_10_2139_ssrn_4624204
crossref_primary_10_1007_s11554_023_01293_9
crossref_primary_10_3233_JIFS_232645
crossref_primary_10_1007_s11554_021_01170_3
crossref_primary_10_1088_1742_6596_2370_1_012029
crossref_primary_10_1007_s11554_023_01268_w
crossref_primary_10_1109_ACCESS_2021_3077499
crossref_primary_10_1109_ACCESS_2022_3221942
crossref_primary_10_1007_s42979_023_02131_2
crossref_primary_10_1109_ACCESS_2022_3174859
crossref_primary_10_1109_ACCESS_2021_3129474
crossref_primary_10_1016_j_bbrc_2021_05_073
crossref_primary_10_3390_jmse11030572
crossref_primary_10_1109_JBHI_2023_3271463
crossref_primary_10_3390_app112411957
crossref_primary_10_3390_electronics9060889
crossref_primary_10_1016_j_measurement_2021_109742
crossref_primary_10_1016_j_future_2022_04_018
crossref_primary_10_1007_s00779_021_01558_9
crossref_primary_10_1155_2022_2582687
crossref_primary_10_3390_app14062424
crossref_primary_10_3390_electronics12040877
crossref_primary_10_3390_en14051426
crossref_primary_10_1109_ACCESS_2022_3203443
crossref_primary_10_1016_j_aej_2021_11_027
crossref_primary_10_46604_aiti_2023_12682
crossref_primary_10_3390_s23042131
crossref_primary_10_3390_app14020731
crossref_primary_10_3390_fire6120446
crossref_primary_10_1016_j_ecoinf_2021_101485
crossref_primary_10_1016_j_matpr_2020_11_562
crossref_primary_10_1007_s12652_021_03580_4
crossref_primary_10_1007_s12652_021_03584_0
crossref_primary_10_3390_electronics13020420
crossref_primary_10_1016_j_psep_2022_06_037
crossref_primary_10_32604_iasc_2022_024890
crossref_primary_10_1007_s42979_024_02869_3
crossref_primary_10_3390_electronics12183907
crossref_primary_10_3390_sym15040951
crossref_primary_10_1007_s10055_023_00922_9
crossref_primary_10_1109_ACCESS_2021_3121309
crossref_primary_10_1109_LRA_2021_3125450
crossref_primary_10_3390_su141911930
crossref_primary_10_1007_s00521_021_06830_w
crossref_primary_10_1109_JSTARS_2022_3140776
crossref_primary_10_1016_j_compag_2024_109078
crossref_primary_10_3390_biomimetics7040163
crossref_primary_10_1016_j_asoc_2021_107610
crossref_primary_10_12677_MOS_2023_125441
crossref_primary_10_3390_wevj15030104
crossref_primary_10_3390_electronics12071609
crossref_primary_10_1109_ACCESS_2023_3298369
crossref_primary_10_35940_ijrte_D7951_1112423
crossref_primary_10_35784_jcsi_2693
crossref_primary_10_1088_1757_899X_1098_3_032076
crossref_primary_10_1145_3631406
crossref_primary_10_3390_app14030989
crossref_primary_10_3390_app12189331
crossref_primary_10_1145_3583074
Cites_doi 10.1007/s11263-009-0275-4
10.5244/C.31.76
10.1109/CVPR.2017.195
10.1109/CVPR.2018.00474
10.1007/978-3-319-10578-9_23
10.1007/978-3-319-46448-0_2
10.1109/CVPR.2015.7298594
10.1007/978-3-319-10602-1_48
10.1007/978-3-030-01264-9_8
10.1109/TPAMI.2009.167
10.1109/CVPR.2017.690
10.1109/CVPR.2016.308
10.1109/CVPR.2017.106
10.1109/ICCV.2017.324
10.1109/CVPR.2017.243
10.1109/CVPR.2014.81
10.1109/CVPR.2016.91
10.1109/ICCV.2015.169
10.1109/CVPR.2018.00716
10.1109/CVPR.2017.754
10.1109/CVPR.2017.351
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020
DBID 97E
ESBDL
RIA
RIE
AAYXX
CITATION
7SC
7SP
7SR
8BQ
8FD
JG9
JQ2
L7M
L~C
L~D
DOA
DOI 10.1109/ACCESS.2019.2961959
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005-present
IEEE Xplore Open Access Journals
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library Online
CrossRef
Computer and Information Systems Abstracts
Electronics & Communications Abstracts
Engineered Materials Abstracts
METADEX
Technology Research Database
Materials Research Database
ProQuest Computer Science Collection
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
Directory of Open Access Journals
DatabaseTitle CrossRef
Materials Research Database
Engineered Materials Abstracts
Technology Research Database
Computer and Information Systems Abstracts – Academic
Electronics & Communications Abstracts
ProQuest Computer Science Collection
Computer and Information Systems Abstracts
Advanced Technologies Database with Aerospace
METADEX
Computer and Information Systems Abstracts Professional
DatabaseTitleList Materials Research Database


Database_xml – sequence: 1
  dbid: DOA
  name: Directory of Open Access Journals
  url: https://www.doaj.org/
  sourceTypes: Open Website
– sequence: 2
  dbid: RIE
  name: IEEE Electronic Library Online
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
EISSN 2169-3536
EndPage 1944
ExternalDocumentID oai_doaj_org_article_4288d0d94a574225b2228f75d67b841f
10_1109_ACCESS_2019_2961959
8941141
Genre orig-research
GrantInformation_xml – fundername: Key Research and Development Program of Jiangsu Province, China
  grantid: BE2017630
– fundername: China Postdoctoral Science Foundation
  grantid: 2014M560390
  funderid: 10.13039/501100002858
– fundername: National Basic Research Program of China (973 Program); National Key Research and Development Program of China
  grantid: 2017YFC1601000; 2017YFC1601800
  funderid: 10.13039/501100012166
– fundername: National Natural Science Foundation of China
  grantid: 61673194; 61672263
  funderid: 10.13039/501100001809
– fundername: Blue Project in Jiangsu Universities
GroupedDBID 0R~
4.4
5VS
6IK
97E
AAJGR
ACGFS
ADBBV
ALMA_UNASSIGNED_HOLDINGS
BCNDV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
EBS
EJD
ESBDL
GROUPED_DOAJ
IFIPE
IPLJI
JAVBF
KQ8
M43
M~E
O9-
OCL
OK1
RIA
RIE
RIG
RNS
AAYXX
CITATION
7SC
7SP
7SR
8BQ
8FD
JG9
JQ2
L7M
L~C
L~D
ID FETCH-LOGICAL-c408t-b5dcae812e828c07dd4921e4a1a929204464f1d65f41b2ebccd286ccc030c02c3
IEDL.DBID RIE
ISSN 2169-3536
IngestDate Tue Oct 22 15:15:44 EDT 2024
Thu Oct 10 18:18:20 EDT 2024
Fri Aug 23 03:24:20 EDT 2024
Mon Nov 04 12:02:50 EST 2024
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Language English
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c408t-b5dcae812e828c07dd4921e4a1a929204464f1d65f41b2ebccd286ccc030c02c3
ORCID 0000-0002-1186-2170
0000-0001-8052-0994
0000-0002-7779-5177
OpenAccessLink https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/document/8941141
PQID 2454821422
PQPubID 4845423
PageCount 10
ParticipantIDs proquest_journals_2454821422
crossref_primary_10_1109_ACCESS_2019_2961959
doaj_primary_oai_doaj_org_article_4288d0d94a574225b2228f75d67b841f
ieee_primary_8941141
PublicationCentury 2000
PublicationDate 20200000
2020-00-00
20200101
2020-01-01
PublicationDateYYYYMMDD 2020-01-01
PublicationDate_xml – year: 2020
  text: 20200000
PublicationDecade 2020
PublicationPlace Piscataway
PublicationPlace_xml – name: Piscataway
PublicationTitle IEEE access
PublicationTitleAbbrev Access
PublicationYear 2020
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref35
krizhevsky (ref12) 2012
ref13
dauphin (ref38) 2013
ref14
iandola (ref31) 2016
dai (ref5) 2016
ref30
ref33
ref11
ref32
ref10
howard (ref36) 2019
ref2
zhu (ref18) 2019
ref1
ref17
tan (ref37) 2019
ref24
ref25
poole (ref40) 2016
ref42
ref41
ref43
duan (ref19) 2019
ioffe (ref27) 2015
ren (ref4) 2015
fu (ref15) 2018
courbariaux (ref21) 2015
ref28
han (ref20) 2015
hubara (ref23) 2017; 18
redmon (ref9) 2018
ref8
howard (ref34) 2017
ref7
li (ref16) 2017
ref3
szegedy (ref29) 2017; 4
ref6
ba (ref39) 2013
courbariaux (ref22) 2016
lin (ref26) 2013
References_xml – ident: ref41
  doi: 10.1007/s11263-009-0275-4
– ident: ref14
  doi: 10.5244/C.31.76
– ident: ref30
  doi: 10.1109/CVPR.2017.195
– start-page: 91
  year: 2015
  ident: ref4
  article-title: Faster R-CNN: Towards real-time object detection with region proposal networks
  publication-title: Proc Adv Neural Inf Process Syst
  contributor:
    fullname: ren
– ident: ref35
  doi: 10.1109/CVPR.2018.00474
– ident: ref2
  doi: 10.1007/978-3-319-10578-9_23
– ident: ref13
  doi: 10.1007/978-3-319-46448-0_2
– start-page: 3123
  year: 2015
  ident: ref21
  article-title: BinaryConnect: Training deep neural networks with binary weights during propagations
  publication-title: Proc Adv Neural Inf Process Syst
  contributor:
    fullname: courbariaux
– year: 2018
  ident: ref15
  article-title: DSSD: Deconvolutional single shot detector
  publication-title: arXiv:1701.06659
  contributor:
    fullname: fu
– ident: ref25
  doi: 10.1109/CVPR.2015.7298594
– ident: ref42
  doi: 10.1007/978-3-319-10602-1_48
– ident: ref33
  doi: 10.1007/978-3-030-01264-9_8
– year: 2019
  ident: ref36
  article-title: Searching for MobileNetV3
  publication-title: arXiv:1905.00571
  contributor:
    fullname: howard
– year: 2019
  ident: ref18
  article-title: Feature selective anchor-free module for single-shot object detection
  publication-title: arXiv:1903.00621
  contributor:
    fullname: zhu
– ident: ref11
  doi: 10.1109/TPAMI.2009.167
– ident: ref8
  doi: 10.1109/CVPR.2017.690
– volume: 18
  start-page: 6869
  year: 2017
  ident: ref23
  article-title: Quantized neural networks: Training neural networks with low precision weights and activations
  publication-title: J Mach Learn Res
  contributor:
    fullname: hubara
– year: 2015
  ident: ref20
  article-title: Deep compression: Compressing deep neural networks with pruning, trained quantization and Huffman coding
  publication-title: arXiv:1510.00149 [cs]
  contributor:
    fullname: han
– ident: ref28
  doi: 10.1109/CVPR.2016.308
– ident: ref6
  doi: 10.1109/CVPR.2017.106
– year: 2018
  ident: ref9
  article-title: YOLOv3: An incremental improvement
  publication-title: arXiv:1804.02767
  contributor:
    fullname: redmon
– year: 2016
  ident: ref22
  article-title: Binarized neural networks: Training deep neural networks with weights and activations constrained to +1 or −1
  publication-title: arXiv:1602.02830 [cs]
  contributor:
    fullname: courbariaux
– ident: ref17
  doi: 10.1109/ICCV.2017.324
– year: 2017
  ident: ref34
  article-title: MobileNets: Efficient convolutional neural networks for mobile vision applications
  publication-title: arXiv:1704.04861
  contributor:
    fullname: howard
– ident: ref10
  doi: 10.1109/CVPR.2017.243
– start-page: 1097
  year: 2012
  ident: ref12
  article-title: ImageNet classification with deep convolutional neural networks
  publication-title: Proc Adv Neural Inf Process Syst
  contributor:
    fullname: krizhevsky
– ident: ref1
  doi: 10.1109/CVPR.2014.81
– year: 2017
  ident: ref16
  article-title: FSSD: Feature fusion single shot multibox detector
  publication-title: arXiv:1712.00960
  contributor:
    fullname: li
– ident: ref7
  doi: 10.1109/CVPR.2016.91
– ident: ref3
  doi: 10.1109/ICCV.2015.169
– year: 2016
  ident: ref40
  article-title: Exponential expressivity in deep neural networks through transient chaos
  publication-title: Proc Adv Neural Inf Process Syst (NIPS)
  contributor:
    fullname: poole
– year: 2013
  ident: ref38
  article-title: Big neural networks waste capacity
  publication-title: arXiv:1301.3583
  contributor:
    fullname: dauphin
– ident: ref32
  doi: 10.1109/CVPR.2018.00716
– year: 2016
  ident: ref31
  article-title: SqueezeNet: AlexNet-level accuracy with 50× fewer parameters and <0.5 MB model size
  publication-title: arXiv:1602.07360
  contributor:
    fullname: iandola
– volume: 4
  start-page: 12
  year: 2017
  ident: ref29
  article-title: Inception-v4, Inception-ResNet and the impact of residual connections on learning
  publication-title: Proc AAAI
  contributor:
    fullname: szegedy
– year: 2013
  ident: ref26
  article-title: Network in network
  publication-title: arXiv:1312.4400
  contributor:
    fullname: lin
– start-page: 2820
  year: 2019
  ident: ref37
  article-title: MnasNet: Platform-aware neural architecture search for mobile
  publication-title: Proc IEEE Conf Comput Vis Pattern Recognit
  contributor:
    fullname: tan
– ident: ref24
  doi: 10.1109/CVPR.2017.754
– year: 2013
  ident: ref39
  article-title: Do deep nets really need to be deep?
  publication-title: Proc Adv Neural Inf Process Syst (NIPS)
  contributor:
    fullname: ba
– ident: ref43
  doi: 10.1109/CVPR.2017.351
– start-page: 379
  year: 2016
  ident: ref5
  article-title: R-FCN: Object detection via region-based fully convolutional networks
  publication-title: Proc Adv Neural Inf Process Syst
  contributor:
    fullname: dai
– year: 2015
  ident: ref27
  article-title: Batch normalization: Accelerating deep network training by reducing internal covariate shift
  publication-title: arXiv:1502.03167
  contributor:
    fullname: ioffe
– year: 2019
  ident: ref19
  article-title: CenterNet: Keypoint triplets for object detection
  publication-title: arXiv:1904.08189
  contributor:
    fullname: duan
SSID ssj0000816957
Score 2.5591962
Snippet Deep neural networks (DNNs) have shown prominent performance in the field of object detection. However, DNNs usually run on powerful devices with high...
SourceID doaj
proquest
crossref
ieee
SourceType Open Website
Aggregation Database
Publisher
StartPage 1935
SubjectTerms Accuracy
Artificial neural networks
Computational modeling
Constrained environments
Convolution
dense connection
Detectors
Electronic devices
Embedded systems
Feature extraction
Feature maps
fire modules
Information flow
Model accuracy
Modules
Object detection
Object recognition
passthrough layer
Performance evaluation
Real time
Real-time systems
YOLO
Title Tinier-YOLO: A Real-Time Object Detection Method for Constrained Environments
URI https://ieeexplore.ieee.org/document/8941141
https://www.proquest.com/docview/2454821422
https://doaj.org/article/4288d0d94a574225b2228f75d67b841f
Volume 8
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider IEEE