Gated Stacked Target-Related Autoencoder: A Novel Deep Feature Extraction and Layerwise Ensemble Method for Industrial Soft Sensor Application
Published in | IEEE Transactions on Cybernetics, Vol. 52, No. 5, pp. 3457-3468 |
Main Authors | Sun, Qingqiang; Ge, Zhiqiang |
Format | Journal Article |
Language | English |
Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.05.2022 |
Subjects | Computational modeling; Data mining; Data models; Deep learning (DL); Feature extraction; gated neurons; Information flow; Logic gates; Machine learning; Modelling; Neurons; nonlinear feature extraction; Process control; Representations; soft sensor; stacked autoencoder (SAE); target-related information |
Abstract | These days, data-driven soft sensors have been widely applied to estimate difficult-to-measure quality variables in industrial processes. How to extract effective feature representations from complex process data remains a difficult and active topic in the soft-sensing field. Deep learning (DL), which has made great progress in many fields recently, has been used for process monitoring and quality prediction owing to its outstanding nonlinear modeling and feature extraction abilities. In this work, a deep stacked autoencoder (SAE) is introduced to construct a soft sensor model. Nevertheless, conventional SAE-based methods do not take information related to target values into account in the pretraining stage and use only the feature representations of the last hidden layer for the final prediction. To this end, a novel gated stacked target-related autoencoder (GSTAE) is proposed to improve modeling performance with respect to these two issues. By adding the prediction errors of target values to the loss function during the layerwise pretraining procedure, target-related information is used to guide the feature learning process. Besides, gated neurons are utilized to control the information flow from different layers to the final output neuron, thereby taking full advantage of representations at different levels of abstraction and quantifying their contributions. Finally, the effectiveness and feasibility of the proposed approach are verified in two real industrial cases. |
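The GSTAE described in the abstract combines two mechanisms: a layerwise pretraining loss that mixes input reconstruction with the prediction error of the target value, and gated neurons that weight each hidden layer's contribution to the final output. The sketch below is a minimal illustration of these two mechanisms, assuming a PyTorch-style implementation; the class names, layer sizes, and the loss-weighting factor alpha are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of the two GSTAE mechanisms summarized in the abstract.
# Assumptions: PyTorch, MSE losses, a scalar quality variable y of shape (N, 1),
# and a weighting factor `alpha` for the target-related term (not specified in the record).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TargetRelatedAE(nn.Module):
    """One autoencoder layer whose pretraining loss also penalizes target prediction error."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.Sigmoid())
        self.decoder = nn.Linear(hid_dim, in_dim)   # reconstructs the layer input
        self.predictor = nn.Linear(hid_dim, 1)      # predicts the quality variable y

    def pretrain_loss(self, x: torch.Tensor, y: torch.Tensor, alpha: float = 0.5) -> torch.Tensor:
        h = self.encoder(x)
        # Reconstruction error plus a target-related term that guides feature learning.
        return F.mse_loss(self.decoder(h), x) + alpha * F.mse_loss(self.predictor(h), y)


class GSTAE(nn.Module):
    """Stack of target-related AEs; gated neurons weight each layer's output contribution."""

    def __init__(self, in_dim: int, hidden_dims: list):
        super().__init__()
        dims = [in_dim] + hidden_dims
        self.layers = nn.ModuleList(
            TargetRelatedAE(dims[i], dims[i + 1]) for i in range(len(hidden_dims))
        )
        # One gate and one output head per hidden layer.
        self.gates = nn.ModuleList(nn.Linear(d, 1) for d in hidden_dims)
        self.heads = nn.ModuleList(nn.Linear(d, 1) for d in hidden_dims)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y_hat, h = torch.zeros(x.shape[0], 1, device=x.device), x
        for layer, gate, head in zip(self.layers, self.gates, self.heads):
            h = layer.encoder(h)                 # feature representation of this layer
            g = torch.sigmoid(gate(h))           # gate in (0, 1): this layer's contribution
            y_hat = y_hat + g * head(h)          # accumulate gated per-layer predictions
        return y_hat


# Typical workflow (sketch): pretrain each TargetRelatedAE layer by layer with
# pretrain_loss on (features, y), then fine-tune the full GSTAE end to end on (x, y).
```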
Author | Sun, Qingqiang; Ge, Zhiqiang |
Author_xml | – sequence: 1 givenname: Qingqiang orcidid: 0000-0002-7042-5640 surname: Sun fullname: Sun, Qingqiang email: sunqingqiang@zju.edu.cn organization: State Key Laboratory of Industrial Control Technology, Institute of Industrial Process Control, College of Control Science and Engineering, Zhejiang University, Hangzhou, China – sequence: 2 givenname: Zhiqiang orcidid: 0000-0002-2071-4380 surname: Ge fullname: Ge, Zhiqiang email: gezhiqiang@zju.edu.cn organization: State Key Laboratory of Industrial Control Technology, Institute of Industrial Process Control, College of Control Science and Engineering, Zhejiang University, Hangzhou, China |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/32833658 (View this record in MEDLINE/PubMed) |
CODEN | ITCEB8 |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022 |
DOI | 10.1109/TCYB.2020.3010331 |
DatabaseName | IEEE Xplore (IEEE) IEEE All-Society Periodicals Package (ASPP) 1998–Present IEEE Xplore CrossRef PubMed Computer and Information Systems Abstracts Electronics & Communications Abstracts Mechanical & Transportation Engineering Abstracts Technology Research Database ANTE: Abstracts in New Technology & Engineering Engineering Research Database Aerospace Database ProQuest Computer Science Collection Advanced Technologies Database with Aerospace Computer and Information Systems Abstracts Academic Computer and Information Systems Abstracts Professional MEDLINE - Academic |
DatabaseTitle | CrossRef PubMed Aerospace Database Technology Research Database Computer and Information Systems Abstracts – Academic Mechanical & Transportation Engineering Abstracts Electronics & Communications Abstracts ProQuest Computer Science Collection Computer and Information Systems Abstracts Engineering Research Database Advanced Technologies Database with Aerospace ANTE: Abstracts in New Technology & Engineering Computer and Information Systems Abstracts Professional MEDLINE - Academic |
Discipline | Sciences (General) |
EISSN | 2168-2275 |
EndPage | 3468 |
ExternalDocumentID | 32833658 10_1109_TCYB_2020_3010331 9174659 |
Genre | orig-research Journal Article |
GrantInformation_xml | – fundername: Natural Science Foundation of Zhejiang Province grantid: LR18F030001 funderid: 10.13039/501100004731 – fundername: National Natural Science Foundation of China grantid: 61722310 funderid: 10.13039/501100001809 |
ISSN | 2168-2267 (print); 2168-2275 (electronic) |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 5 |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
ORCID | 0000-0002-2071-4380 0000-0002-7042-5640 |
PMID | 32833658 |
PQID | 2667016689 |
PQPubID | 85422 |
PageCount | 12 |
PublicationDate | 2022-05-01 |
PublicationPlace | United States |
PublicationTitle | IEEE transactions on cybernetics |
PublicationTitleAbbrev | TCYB |
PublicationTitleAlternate | IEEE Trans Cybern |
PublicationYear | 2022 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
StartPage | 3457 |
SubjectTerms | Computational modeling; Data mining; Data models; Deep learning (DL); Feature extraction; gated neurons; Information flow; Logic gates; Machine learning; Modelling; Neurons; nonlinear feature extraction; Process control; Representations; soft sensor; stacked autoencoder (SAE); target-related information
Title | Gated Stacked Target-Related Autoencoder: A Novel Deep Feature Extraction and Layerwise Ensemble Method for Industrial Soft Sensor Application |
URI | https://ieeexplore.ieee.org/document/9174659 https://www.ncbi.nlm.nih.gov/pubmed/32833658 https://www.proquest.com/docview/2667016689 https://www.proquest.com/docview/2437129650 |
Volume | 52 |