Decision-Tree-Initialized Dendritic Neuron Model for Fast and Accurate Data Classification
This work proposes a decision tree (DT)-based method for initializing a dendritic neuron model (DNM). Neural networks are growing ever larger and consuming ever more computing resources, which creates a strong need to prune neurons that contribute little to a network's output...
Saved in:
Published in | IEEE Transactions on Neural Networks and Learning Systems, Vol. 33, No. 9, pp. 4173–4183 |
Main Authors | Luo, Xudong; Wen, Xiaohao; Zhou, MengChu; Abusorrah, Abdullah; Huang, Lukui |
Format | Journal Article |
Language | English |
Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.09.2022 |
Subjects | Decision trees; dendritic neuron model (DNM); Neural networks; Classification; Machine learning |
Abstract | This work proposes a decision tree (DT)-based method for initializing a dendritic neuron model (DNM). Neural networks are growing ever larger and consuming ever more computing resources, which creates a strong need to prune neurons that contribute little to a network's output. However, pruning low-contribution neurons may reduce a DNM's accuracy. Our proposed method is novel because 1) it reduces the number of dendrites in a DNM and improves training efficiency without affecting accuracy, and 2) it selects proper initialization weights and thresholds for neurons. The Adam algorithm is used to train the DNM after it is initialized with our proposed DT-based method. To verify its effectiveness, we apply it to seven benchmark datasets. The results show that the decision-tree-initialized DNM is significantly better than the original DNM, k-nearest neighbor, support vector machine, back-propagation neural network, and DT classification methods. It exhibits the lowest model complexity and the highest training speed without losing any accuracy. The interactions among attributes can also be observed in its dendritic neurons. |
AbstractList | This work proposes a decision tree (DT)-based method for initializing a dendritic neuron model (DNM). Neural networks are growing ever larger and consuming ever more computing resources, which creates a strong need to prune neurons that contribute little to a network's output. However, pruning low-contribution neurons may reduce a DNM's accuracy. Our proposed method is novel because 1) it reduces the number of dendrites in a DNM and improves training efficiency without affecting accuracy, and 2) it selects proper initialization weights and thresholds for neurons. The Adam algorithm is used to train the DNM after it is initialized with our proposed DT-based method. To verify its effectiveness, we apply it to seven benchmark datasets. The results show that the decision-tree-initialized DNM is significantly better than the original DNM, k-nearest neighbor, support vector machine, back-propagation neural network, and DT classification methods. It exhibits the lowest model complexity and the highest training speed without losing any accuracy. The interactions among attributes can also be observed in its dendritic neurons. |
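The core initialization idea described in the abstract can be sketched in miniature: fit a decision stump (the simplest DT) on one attribute, then place a dendritic synapse's sigmoid transition at the learned split point before Adam fine-tuning. This is a hedged illustration under assumed conventions, not the authors' implementation: the synapse form 1/(1+exp(-k·(w·x − θ))), the gain `k`, and the helper names `best_split` and `init_synapse` are all illustrative.

```python
import numpy as np

def best_split(x, y):
    """Exhaustive 1-D decision-stump search: return the midpoint threshold
    that minimizes misclassification error, as a CART-style tree split would."""
    xs = np.sort(np.unique(x))
    best_t, best_err = xs[0], len(y) + 1
    for lo, hi in zip(xs[:-1], xs[1:]):
        t = (lo + hi) / 2.0
        pred = (x > t).astype(int)
        # try both polarities of the stump (class 1 above or below the cut)
        err = min(np.sum(pred != y), np.sum(pred == y))
        if err < best_err:
            best_err, best_t = err, t
    return best_t

def init_synapse(t, k=5.0):
    """Map a DT split threshold t to a synaptic weight w and threshold theta
    so the synapse sigmoid 1/(1+exp(-k*(w*x - theta))) crosses 0.5 at x = t."""
    w = 1.0          # illustrative unit weight
    theta = w * t    # sigmoid midpoint aligned with the tree's cut point
    return w, theta
```

With synapses seeded this way, Adam only has to fine-tune parameters that already separate the classes, which is one plausible reading of why the paper reports faster training at equal accuracy.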
Author | Zhou, MengChu Luo, Xudong Abusorrah, Abdullah Huang, Lukui Wen, Xiaohao |
Author_xml | – sequence: 1 givenname: Xudong orcidid: 0000-0002-3650-8450 surname: Luo fullname: Luo, Xudong email: luoxudong@gxnu.edu.cn organization: Teachers College for Vocational and Technical Education, Guangxi Normal University, Guilin, China – sequence: 2 givenname: Xiaohao orcidid: 0000-0003-4368-1443 surname: Wen fullname: Wen, Xiaohao email: wenxiaohao@gxnu.edu.cn organization: Teachers College for Vocational and Technical Education, Guangxi Normal University, Guilin, China – sequence: 3 givenname: MengChu orcidid: 0000-0002-5408-8752 surname: Zhou fullname: Zhou, MengChu email: mengchu.zhou@njit.edu organization: Department of Electrical and Computer Engineering, New Jersey Institute of Technology, Newark, NJ, USA – sequence: 4 givenname: Abdullah orcidid: 0000-0001-8025-0453 surname: Abusorrah fullname: Abusorrah, Abdullah email: aabusorrah@kau.edu.sa organization: Center of Research Excellence in Renewable Energy and Power Systems and the Department of Electrical and Computer Engineering, King Abdulaziz University, Jeddah, Saudi Arabia – sequence: 5 givenname: Lukui surname: Huang fullname: Huang, Lukui email: lukui-hua59@tbs.tu.ac.th organization: Guangxi University of Finance and Economics, Nanning, China |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/33729951$$D View this record in MEDLINE/PubMed |
CODEN | ITNNAL |
CitedBy_id | crossref_primary_10_1016_j_knosys_2024_111729 crossref_primary_10_3390_molecules28020809 crossref_primary_10_1007_s10666_023_09918_w crossref_primary_10_3390_sym14010011 crossref_primary_10_5312_wjo_v14_i10_741 crossref_primary_10_1039_D4AY01346H crossref_primary_10_1007_s10666_023_09931_z crossref_primary_10_1038_s41598_024_66979_x crossref_primary_10_1109_TCYB_2022_3173632 crossref_primary_10_1016_j_exger_2024_112535 crossref_primary_10_1016_j_ins_2024_121034 crossref_primary_10_1007_s00521_023_09299_x crossref_primary_10_3390_electronics12010094 crossref_primary_10_3934_mbe_2023328 crossref_primary_10_1080_00207543_2024_2448604 crossref_primary_10_1038_s41598_024_84895_y crossref_primary_10_1155_2023_7037124 crossref_primary_10_29407_intensif_v8i2_22280 crossref_primary_10_3390_app13116542 crossref_primary_10_1007_s10462_024_10790_7 crossref_primary_10_1016_j_knosys_2021_107536 crossref_primary_10_1109_JIOT_2024_3413181 crossref_primary_10_1109_TETCI_2024_3367819 crossref_primary_10_1109_JAS_2023_123648 crossref_primary_10_1109_TAI_2024_3416236 crossref_primary_10_1109_TASE_2024_3360476 crossref_primary_10_1155_2022_1815170 crossref_primary_10_53941_ijndi0101004 crossref_primary_10_1080_10255842_2025_2472013 crossref_primary_10_3390_s24061729 crossref_primary_10_1109_TNNLS_2021_3105901 crossref_primary_10_3390_electronics13193911 crossref_primary_10_1016_j_knosys_2024_111442 crossref_primary_10_1109_TNNLS_2021_3105905 crossref_primary_10_1109_TNNLS_2023_3290203 |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022 |
Copyright_xml | – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022 |
DBID | 97E RIA RIE AAYXX CITATION NPM 7QF 7QO 7QP 7QQ 7QR 7SC 7SE 7SP 7SR 7TA 7TB 7TK 7U5 8BQ 8FD F28 FR3 H8D JG9 JQ2 KR7 L7M L~C L~D P64 7X8 |
DOI | 10.1109/TNNLS.2021.3055991 |
DatabaseName | IEEE All-Society Periodicals Package (ASPP) 2005–Present IEEE All-Society Periodicals Package (ASPP) 1998–Present IEEE Electronic Library (IEL) CrossRef PubMed Aluminium Industry Abstracts Biotechnology Research Abstracts Calcium & Calcified Tissue Abstracts Ceramic Abstracts Chemoreception Abstracts Computer and Information Systems Abstracts Corrosion Abstracts Electronics & Communications Abstracts Engineered Materials Abstracts Materials Business File Mechanical & Transportation Engineering Abstracts Neurosciences Abstracts Solid State and Superconductivity Abstracts METADEX Technology Research Database ANTE: Abstracts in New Technology & Engineering Engineering Research Database Aerospace Database Materials Research Database ProQuest Computer Science Collection Civil Engineering Abstracts Advanced Technologies Database with Aerospace Computer and Information Systems Abstracts Academic Computer and Information Systems Abstracts Professional Biotechnology and BioEngineering Abstracts MEDLINE - Academic |
DatabaseTitle | CrossRef PubMed Materials Research Database Technology Research Database Computer and Information Systems Abstracts – Academic Mechanical & Transportation Engineering Abstracts ProQuest Computer Science Collection Computer and Information Systems Abstracts Materials Business File Aerospace Database Engineered Materials Abstracts Biotechnology Research Abstracts Chemoreception Abstracts Advanced Technologies Database with Aerospace ANTE: Abstracts in New Technology & Engineering Civil Engineering Abstracts Aluminium Industry Abstracts Electronics & Communications Abstracts Ceramic Abstracts Neurosciences Abstracts METADEX Biotechnology and BioEngineering Abstracts Computer and Information Systems Abstracts Professional Solid State and Superconductivity Abstracts Engineering Research Database Calcium & Calcified Tissue Abstracts Corrosion Abstracts MEDLINE - Academic |
DatabaseTitleList | MEDLINE - Academic PubMed Materials Research Database |
Database_xml | – sequence: 1 dbid: NPM name: PubMed url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed sourceTypes: Index Database – sequence: 2 dbid: RIE name: IEEE Xplore url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/ sourceTypes: Publisher |
DeliveryMethod | fulltext_linktorsrc |
Discipline | Computer Science |
EISSN | 2162-2388 |
EndPage | 4183 |
ExternalDocumentID | 33729951 10_1109_TNNLS_2021_3055991 9380661 |
Genre | orig-research Journal Article |
GrantInformation_xml | – fundername: Guangxi Vocational Education Teaching Reform Research Project in 2020 grantid: GXGZJG2020B101 – fundername: Education and Teaching Reform Project of Guangxi Normal University in 2019 grantid: 2019JGB36 funderid: 10.13039/501100009007 – fundername: The Deanship of Scientific Research (DSR) at King Abdulaziz University grantid: RG-6-135-38 funderid: 10.13039/501100004054 – fundername: Research Foundation Capacity Improvement Project for Young and Middle-Aged Teachers in Guangxi Universities of China in 2020 grantid: 2020KY02029 |
GroupedDBID | 0R~ 4.4 5VS 6IK 97E AAJGR AARMG AASAJ AAWTH ABAZT ABQJQ ABVLG ACIWK ACPRK AENEX AFRAH AGQYO AGSQL AHBIQ AKJIK AKQYR ALMA_UNASSIGNED_HOLDINGS ATWAV BEFXN BFFAM BGNUA BKEBE BPEOZ EBS EJD IFIPE IPLJI JAVBF M43 MS~ O9- OCL PQQKQ RIA RIE RNS AAYXX CITATION RIG NPM 7QF 7QO 7QP 7QQ 7QR 7SC 7SE 7SP 7SR 7TA 7TB 7TK 7U5 8BQ 8FD F28 FR3 H8D JG9 JQ2 KR7 L7M L~C L~D P64 7X8 |
IEDL.DBID | RIE |
ISSN | 2162-237X 2162-2388 |
IngestDate | Thu Jul 10 17:04:49 EDT 2025 Mon Jun 30 03:23:48 EDT 2025 Thu Apr 03 06:56:43 EDT 2025 Thu Apr 24 22:51:26 EDT 2025 Tue Jul 01 00:27:38 EDT 2025 Wed Aug 27 02:29:21 EDT 2025 |
IsPeerReviewed | false |
IsScholarly | true |
Issue | 9 |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
LinkModel | DirectLink |
Notes | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 content type line 23 |
ORCID | 0000-0003-4368-1443 0000-0001-8025-0453 0000-0002-3650-8450 0000-0002-5408-8752 |
PMID | 33729951 |
PQID | 2708643582 |
PQPubID | 85436 |
PageCount | 11 |
ParticipantIDs | crossref_primary_10_1109_TNNLS_2021_3055991 proquest_miscellaneous_2502805694 pubmed_primary_33729951 ieee_primary_9380661 proquest_journals_2708643582 crossref_citationtrail_10_1109_TNNLS_2021_3055991 |
ProviderPackageCode | CITATION AAYXX |
PublicationCentury | 2000 |
PublicationDate | 2022-09-01 |
PublicationDateYYYYMMDD | 2022-09-01 |
PublicationDate_xml | – month: 09 year: 2022 text: 2022-09-01 day: 01 |
PublicationDecade | 2020 |
PublicationPlace | United States |
PublicationPlace_xml | – name: United States – name: Piscataway |
PublicationTitle | IEEE Transactions on Neural Networks and Learning Systems |
PublicationTitleAbbrev | TNNLS |
PublicationTitleAlternate | IEEE Trans Neural Netw Learn Syst |
PublicationYear | 2022 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
Publisher_xml | – name: IEEE – name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
SSID | ssj0000605649 |
Snippet | This work proposes a decision tree (DT)-based method for initializing a dendritic neuron model (DNM). Neural networks become larger and larger, thus consuming... |
SourceID | proquest pubmed crossref ieee |
SourceType | Aggregation Database Index Database Enrichment Source Publisher |
StartPage | 4173 |
SubjectTerms | Accuracy Algorithms Back propagation networks Biological neural networks Biomembranes Classification Computational modeling decision tree (DT) Decision trees Dendrites dendritic neuron model (DNM) Dendritic structure machine learning neural network Neural networks Neurons Sparse matrices Support vector machines Synapses Training |
Title | Decision-Tree-Initialized Dendritic Neuron Model for Fast and Accurate Data Classification |
URI | https://ieeexplore.ieee.org/document/9380661 https://www.ncbi.nlm.nih.gov/pubmed/33729951 https://www.proquest.com/docview/2708643582 https://www.proquest.com/docview/2502805694 |
Volume | 33 |
hasFullText | 1 |
inHoldings | 1 |
isFullTextHit | |
isPrint | |
linkProvider | IEEE |