A structured multi-head attention prediction method based on heterogeneous financial data

Bibliographic Details
Published in: PeerJ Computer Science, Vol. 9, p. e1653
Main Authors: Zhao, Cheng; Li, Fangyong; Peng, Zhe; Zhou, Xiao; Zhuge, Yan
Format: Journal Article
Language: English
Published: United States, PeerJ Inc., 17.11.2023
Summary: The diverse characteristics of heterogeneous data pose challenges in analyzing combined price and volume data. Therefore, appropriately handling heterogeneous financial data is crucial for accurate stock prediction. This article proposes a model that applies customized data processing methods tailored to the characteristics of different types of heterogeneous financial data, enabling finer granularity and improved feature extraction. Using a structured multi-head attention mechanism, the model captures the impact of heterogeneous financial data on stock price trends by extracting information from technical, financial, and sentiment indicators separately. Experimental results on four representative individual stocks in China's A-share market demonstrate the effectiveness of the proposed method. The model achieves an average MAPE of 1.378%, which is 0.429 percentage points lower than the benchmark algorithm, and the backtesting return rate shows an average increase of 28.56%. These results validate that the customized preprocessing method and structured multi-head attention mechanism can enhance prediction accuracy by attending to each type of heterogeneous data individually.
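The abstract describes the architecture only at a high level. The sketch below is a minimal illustration, not the authors' implementation: it assumes PyTorch, illustrative layer sizes, mean-pooling over time, and concatenation as the fusion step, to show how separate attention heads per indicator group (technical, financial, sentiment) could be wired together.

```python
# Minimal sketch of group-wise ("structured") multi-head attention.
# All dimensions, the pooling, and the fusion strategy are illustrative assumptions.

import torch
import torch.nn as nn


class StructuredMultiHeadAttention(nn.Module):
    def __init__(self, tech_dim, fin_dim, sent_dim, d_model=64, heads_per_group=2):
        super().__init__()
        # Group-specific projections give each data type its own feature space.
        self.proj = nn.ModuleDict({
            "technical": nn.Linear(tech_dim, d_model),
            "financial": nn.Linear(fin_dim, d_model),
            "sentiment": nn.Linear(sent_dim, d_model),
        })
        # One multi-head self-attention block per indicator group.
        self.attn = nn.ModuleDict({
            name: nn.MultiheadAttention(d_model, heads_per_group, batch_first=True)
            for name in self.proj
        })
        # Fuse the per-group representations and predict the next price.
        self.head = nn.Linear(3 * d_model, 1)

    def forward(self, technical, financial, sentiment):
        # Each input has shape (batch, seq_len, group_dim).
        groups = {"technical": technical, "financial": financial, "sentiment": sentiment}
        pooled = []
        for name, x in groups.items():
            h = self.proj[name](x)
            h, _ = self.attn[name](h, h, h)   # self-attention within the group only
            pooled.append(h.mean(dim=1))      # average over the time dimension
        return self.head(torch.cat(pooled, dim=-1))


# Example usage with random data: a batch of 8 samples over 20 trading days.
model = StructuredMultiHeadAttention(tech_dim=10, fin_dim=6, sent_dim=3)
pred = model(torch.randn(8, 20, 10), torch.randn(8, 20, 6), torch.randn(8, 20, 3))
print(pred.shape)  # torch.Size([8, 1])
```

The point of keeping the attention blocks separate is that each indicator group is attended to on its own scale before fusion, which mirrors the paper's claim that treating the heterogeneous data types individually improves feature extraction.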
ISSN: 2376-5992
DOI: 10.7717/peerj-cs.1653