MSCL-Attention: A Multi-Scale Convolutional Long Short-Term Memory (LSTM) Attention Network for Predicting CO2 Emissions from Vehicles


Bibliographic Details
Published in: Sustainability, Vol. 16, No. 19, p. 8547
Main Authors: Xie, Yi; Liu, Lizhuang; Han, Zhenqi; Zhang, Jialu
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.10.2024

Summary: The transportation industry is one of the major sources of energy consumption and CO2 emissions, and these emissions have been increasing year by year. Vehicle exhaust emissions have had serious impacts on air quality and global climate change, with CO2 emissions being one of the primary causes of global warming. To accurately predict the CO2 emission level of automobiles, this study proposes MSCL-Attention, a model based on a multi-scale convolutional neural network, a long short-term memory (LSTM) network, and a multi-head self-attention mechanism. By combining multi-scale feature extraction, temporal-dependency modeling, and self-attention, the model improves prediction accuracy and robustness. In our experiments, the MSCL-Attention model is benchmarked against the latest state-of-the-art models in the field. The results indicate that MSCL-Attention outperforms the leading models currently available on the CO2 emission prediction task. This study provides a new method for predicting vehicle exhaust emissions, with significant application prospects, and is expected to contribute to reducing global vehicle emissions, improving air quality, and addressing climate change.
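The summary only names the three building blocks, so the following is a minimal NumPy sketch of how such a pipeline could be wired together: parallel 1-D convolutions with different kernel widths (multi-scale features), a single LSTM layer (temporal dependencies), and one attention layer pooled into a scalar emission estimate. All layer sizes, kernel widths, the single attention head, and the random weights are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w):
    """'Same'-padded 1-D convolution over time with ReLU.
    x: (T, C_in), w: (k, C_in, C_out) -> (T, C_out)."""
    k = w.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, k - 1 - pad), (0, 0)))
    out = np.stack([np.einsum("kc,kco->o", xp[t:t + k], w)
                    for t in range(x.shape[0])])
    return np.maximum(out, 0.0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm(x, Wx, Wh, b):
    """Minimal LSTM. x: (T, C), Wx: (C, 4H), Wh: (H, 4H) -> (T, H)."""
    H = Wh.shape[0]
    h, c, outs = np.zeros(H), np.zeros(H), []
    for xt in x:
        i, f, g, o = np.split(xt @ Wx + h @ Wh + b, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)   # cell state
        h = sigmoid(o) * np.tanh(c)                    # hidden state
        outs.append(h)
    return np.stack(outs)

def attention(x, Wq, Wk, Wv):
    """Scaled dot-product self-attention (one head for brevity;
    the paper uses multiple heads)."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    a = q @ k.T / np.sqrt(k.shape[1])
    a = np.exp(a - a.max(axis=-1, keepdims=True))
    return (a / a.sum(axis=-1, keepdims=True)) @ v

# Toy forward pass: 12 time steps, 4 input features per step.
T, C_in, C_out, H = 12, 4, 8, 16
x = rng.standard_normal((T, C_in))

# Multi-scale branch: kernels of width 3, 5, 7, concatenated.
feats = np.concatenate(
    [conv1d(x, 0.1 * rng.standard_normal((k, C_in, C_out)))
     for k in (3, 5, 7)], axis=1)                      # (T, 3*C_out)

h = lstm(feats,
         0.1 * rng.standard_normal((feats.shape[1], 4 * H)),
         0.1 * rng.standard_normal((H, 4 * H)),
         np.zeros(4 * H))                              # (T, H)

ctx = attention(h, *(0.1 * rng.standard_normal((H, H))
                     for _ in range(3)))               # (T, H)

# Mean-pool over time, then a linear head gives the CO2 estimate.
y = ctx.mean(axis=0) @ (0.1 * rng.standard_normal(H))
print(feats.shape, h.shape, float(y))
```

In a real implementation the branches, LSTM, and attention would be trained end to end (e.g. in a deep-learning framework); this sketch only shows how the feature shapes flow between the three stages.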
ISSN: 2071-1050
DOI: 10.3390/su16198547