Explainable Deep Learning Model for Carbon Dioxide Estimation

Bibliographic Details
Published in IEICE Transactions on Information and Systems, Vol. E108.D, No. 9, pp. 1138-1141
Main Authors LEE, Chong-Hui, HUANG, Lin-Hao, QI, Fang-Bin, WANG, Wei-Juan, ZHANG, Xian-Ji, LI, Zhen
Format Journal Article
Language English
Published The Institute of Electronics, Information and Communication Engineers (一般社団法人 電子情報通信学会), 01.09.2025
Subjects
ISSN 0916-8532
EISSN 1745-1361
DOI 10.1587/transinf.2024EDL8087

Abstract In recent years, environmental sustainability and the reduction of CO2 emissions have become significant research topics. To effectively reduce CO2 emissions, recent studies have used deep learning models to provide precise estimates, but these models often lack interpretability. In light of this, our study employs an explainable neural network to learn fuel consumption, which is then converted to CO2 emissions. The explainable neural network includes an explainable layer that can explain the importance of each input variable. Through this layer, the study can elucidate the impact of different speeds on fuel consumption and CO2 emissions. Validated with real fleet data, our study demonstrates an impressive mean absolute percentage error (MAPE) of only 3.3%, outperforming recent research methods.
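The abstract describes an explainable layer whose learned weights expose how much each input (e.g., speed) contributes to the fuel-consumption estimate, with accuracy reported as MAPE. As a minimal, hypothetical sketch of that idea (not the authors' actual architecture; the synthetic data, the gate-based importance mechanism, and all names below are illustrative assumptions), a per-feature gate trained jointly with a linear readout yields an importance score per input:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for fleet data: three speed-derived inputs -> fuel use.
# Only feature 0 is genuinely predictive; the other two are noise.
X = rng.uniform(0.0, 1.0, size=(256, 3))
y = 2.0 + 5.0 * X[:, 0] + 0.1 * rng.normal(size=256)

# "Explainable layer": one learnable gate per input, applied element-wise
# before the readout. After training, the magnitude of each input's
# effective coefficient (gate * readout weight) ranks the inputs by importance.
gates = np.ones(3)            # explainable-layer weights
w = 0.1 * rng.normal(size=3)  # readout weights (stand-in for the deep net)
b = 0.0
lr = 0.1

for _ in range(3000):
    pred = (X * gates) @ w + b            # forward pass through the gate layer
    err = pred - y
    grad_w = (X * gates).T @ err / len(y)  # d(MSE)/dw
    grad_g = (X * w).T @ err / len(y)      # d(MSE)/dgates
    grad_b = err.mean()
    w -= lr * grad_w
    gates -= lr * grad_g
    b -= lr * grad_b

importance = np.abs(gates * w)                 # per-input contribution strength
mape = np.mean(np.abs(pred - y) / y) * 100.0   # mean absolute percentage error
print("importance:", importance.round(3), "MAPE: %.2f%%" % mape)
```

On this synthetic data the gate on the single informative feature dominates, mirroring how an explainable layer can rank the impact of different speeds on fuel consumption; MAPE is computed here as mean(|pred - y| / y) x 100.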
ArticleNumber 2024EDL8087
Copyright 2025 The Institute of Electronics, Information and Communication Engineers
Discipline Engineering
Computer Science
EISSN 1745-1361
EndPage 1141
ISSN 0916-8532
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 9
Language English
OpenAccessLink https://www.jstage.jst.go.jp/article/transinf/E108.D/9/E108.D_2024EDL8087/_article/-char/en
PageCount 4
ParticipantIDs crossref_primary_10_1587_transinf_2024EDL8087
nii_cinii_1390584870603025536
jstage_primary_article_transinf_E108_D_9_E108_D_2024EDL8087_article_char_en
PublicationDate 2025-09-01
PublicationTitle IEICE Transactions on Information and Systems
PublicationTitleAlternate IEICE Trans. Inf. & Syst.
PublicationYear 2025
Publisher The Institute of Electronics, Information and Communication Engineers
一般社団法人 電子情報通信学会
References
[1] Y. Tang, C. Zhao, J. Wang, C. Zhang, Q. Sun, W.X. Zheng, W. Du, F. Qian, and J. Kurths, “Perception and Navigation in Autonomous Systems in the Era of Learning: A Survey,” IEEE Trans. Neural Netw. Learn. Syst., vol.34, no.12, pp.9604-9624, Dec. 2023. doi: 10.1109/TNNLS.2022.3167688
[2] T.-H. Tsai and D.-B. Lin, “An On-Chip Fully Connected Neural Network Training Hardware Accelerator Based on Brain Float Point and Sparsity Awareness,” IEEE Open Journal of Circuits and Systems, vol.4, pp.85-98, 2023. doi: 10.1109/OJCAS.2023.3245061
[3] A. Sayeed, Y. Choi, J. Jung, Y. Lops, E. Eslami, and A.K. Salman, “A Deep Convolutional Neural Network Model for Improving WRF Simulations,” IEEE Trans. Neural Netw. Learn. Syst., vol.34, no.2, pp.750-760, Feb. 2023. doi: 10.1109/TNNLS.2021.3100902
[4] S.-H. Lee and H.-C. Ku, “A Dual Attention-Based Recurrent Neural Network for Short-Term Bike Sharing Usage Demand Prediction,” IEEE Trans. Intell. Transp. Syst., vol.24, no.4, pp.4621-4630, April 2023. doi: 10.1109/TITS.2022.3208087
[5] K. Han, Y. Wang, H. Chen, X. Chen, J. Guo, Z. Liu, Y. Tang, A. Xiao, C. Xu, Y. Xu, Z. Yang, Y. Zhang, and D. Tao, “A Survey on Vision Transformer,” IEEE Trans. Pattern Anal. Mach. Intell., vol.45, no.1, pp.87-110, 1 Jan. 2023. doi: 10.1109/TPAMI.2022.3152247
[6] W. Citko and W. Sienko, “Image Recognition and Reconstruction With Machine Learning: An Inverse Problem Approach,” IEEE Access, vol.11, pp.107463-107471, 2023. doi: 10.1109/ACCESS.2023.3315831
[7] D.W. Otter, J.R. Medina, and J.K. Kalita, “A Survey of the Usages of Deep Learning for Natural Language Processing,” IEEE Trans. Neural Netw. Learn. Syst., vol.32, no.2, pp.604-624, Feb. 2021. doi: 10.1109/TNNLS.2020.2979670
[8] C. Cervellera and D. Macciò, “Local Linear Regression for Function Learning: An Analysis Based on Sample Discrepancy,” IEEE Trans. Neural Netw. Learn. Syst., vol.25, no.11, pp.2086-2098, Nov. 2014. doi: 10.1109/TNNLS.2014.2305193
[9] H. Fang, C. Shi, and C.-H. Chen, “BioExpDNN: Bioinformatic Explainable Deep Neural Network,” 2020 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), Seoul, Korea (South), pp.2461-2467, 2020. doi: 10.1109/BIBM49941.2020.9313113
[10] C.-H. Chen, “ExpDNN: Explainable Deep Neural Network,” arXiv, arXiv:2005.03461, pp.1-2, 2020. doi: 10.48550/arXiv.2005.03461
[11] C.-L. Lo, C.-H. Chen, T.-S. Kuan, K.-R. Lo, and H.-J. Cho, “Fuel Consumption Estimation System and Method with Lower Cost,” Symmetry, vol.9, no.7, article no.105, 2017. doi: 10.3390/sym9070105
[12] L. Zhao, W. Pu, R. Zhou, and Q. Shi, “A Third-Order Majorization Algorithm for Logistic Regression With Convergence Rate Guarantees,” IEEE Signal Process. Lett., vol.31, pp.1700-1704, 2024. doi: 10.1109/LSP.2024.3413306
[13] J. Liu and D. Zhou, “Minimum Functional Length Analysis of K-Mer Based on BPNN,” IEEE/ACM Trans. Comput. Biol. Bioinf., vol.19, no.5, pp.2920-2925, 1 Sept.-Oct. 2022. doi: 10.1109/TCBB.2021.3098512
[14] Y. Lou, R. Wu, J. Li, L. Wang, X. Li, and G. Chen, “A Learning Convolutional Neural Network Approach for Network Robustness Prediction,” IEEE Trans. Cybern., vol.53, no.7, pp.4531-4544, July 2023. doi: 10.1109/TCYB.2022.3207878
[15] Y. Qin, H. Fu, F. Xu, and Y. Jin, “EMWP-RNN: A Physics-Encoded Recurrent Neural Network for Wave Propagation in Plasmas,” IEEE Antennas Wireless Propag. Lett., vol.23, no.1, pp.219-223, Jan. 2024. doi: 10.1109/LAWP.2023.3321914
[16] A. Faraji, S.A. Sadrossadat, W. Na, F. Feng, and Q.-J. Zhang, “A New Macromodeling Method Based on Deep Gated Recurrent Unit Regularized With Gaussian Dropout for Nonlinear Circuits,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol.70, no.7, pp.2904-2915, July 2023. doi: 10.1109/TCSI.2023.3264616
[17] R. Chakraborty and Y. Hasija, “Predicting MicroRNA Sequence Using CNN and LSTM Stacked in Seq2Seq Architecture,” IEEE/ACM Trans. Comput. Biol. Bioinf., vol.17, no.6, pp.2183-2188, 1 Nov.-Dec. 2020. doi: 10.1109/TCBB.2019.2936186
[18] X. Yang, X. Zheng, and H. Gao, “SGD-Based Adaptive NN Control Design for Uncertain Nonlinear Systems,” IEEE Trans. Neural Netw. Learn. Syst., vol.29, no.10, pp.5071-5083, Oct. 2018. doi: 10.1109/TNNLS.2018.2790479
[19] C. Chen, L. Shen, W. Liu, and Z.-Q. Luo, “Efficient-Adam: Communication-Efficient Distributed Adam,” IEEE Trans. Signal Process., vol.71, pp.3257-3266, 2023. doi: 10.1109/TSP.2023.3309461
[20] A.N.T. Kissiedu, G.K. Aggrey, M.G. Asante-Mensah, and A. Asante, “Development of Pneumonia Identification System: A Comparative Analysis of Some Selected CNN Architectures Using Adam, Nadam, and RAdam Optimizers,” 2024 IEEE SmartBlock4Africa, Accra, Ghana, pp.1-12, 2024. doi: 10.1109/SmartBlock4Africa61928.2024
[21] C. Li, Q. Zhong, and B. Li, “Clustering-Based Neural Network for Carbon Dioxide Estimation,” IEICE Trans. Inf. & Syst., vol.E106-D, no.5, pp.829-832, May 2023. doi: 10.1587/transinf.2022DLL0012
[22] C.-H. Chen, “Fuel Consumption Estimation Method Based on Clustering-based Deep Learning Model,” Asia-Pacific Journal of Clinical Oncology, vol.18, no.S2, pp.129-130, Aug. 2022. doi: 10.1111/ajco.13830
StartPage 1138
SubjectTerms Carbon dioxide estimation
deep learning
explainable neural network
fuel consumption
URI https://www.jstage.jst.go.jp/article/transinf/E108.D/9/E108.D_2024EDL8087/_article/-char/en
https://cir.nii.ac.jp/crid/1390584870603025536
Volume E108.D