DGTR: Dynamic graph transformer for rumor detection

Bibliographic Details
Published in: Frontiers in Research Metrics and Analytics, Vol. 7, p. 1055348
Main Authors: Wei, Siqi; Wu, Bin; Xiang, Aoxue; Zhu, Yangfu; Song, Chenguang
Format: Journal Article
Language: English
Published: Frontiers Media S.A., Switzerland, 11.01.2023
ISSN: 2504-0537
DOI: 10.3389/frma.2022.1055348


More Information
Summary: Social media rumors can harm public perception and social progress. The news propagation pattern is a key clue for detecting rumors. Existing propagation-based rumor detection methods represent propagation patterns as a static graph structure: they consider only the structural information of news distribution in social networks and disregard the temporal information. The dynamic graph is an effective tool for modeling both the structural and temporal information involved in the process of news dissemination. However, existing dynamic graph representation learning approaches struggle to capture the long-range dependence of the structure and the temporal sequence, as well as the rich semantic associations between full-graph features and individual parts. To address these challenges, we build a transformer-based dynamic graph representation learning approach for rumor identification, DGTR. We design a position embedding format for graph data such that the original transformer model can be used to learn dynamic graph representations. By employing a self-attention mechanism, the model can capture the long-range structural dependence between the dynamic graph nodes and the long-range temporal dependence between the temporal snapshots. In addition, the CLS token in the transformer can model the rich semantic relationships between the complete graph and each of its subparts. Extensive experiments demonstrate the superiority of our model over the state of the art.
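The core idea described in the abstract — prepending a CLS token to a sequence of temporal snapshot embeddings, adding position embeddings, and applying self-attention so the CLS output summarizes the whole dynamic graph — can be sketched in a minimal form. This is an illustrative NumPy sketch, not the authors' implementation; all dimensions, random weights, and the assumption that per-snapshot graph embeddings are precomputed are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # single-head scaled dot-product self-attention over rows of X
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores, axis=-1) @ V

rng = np.random.default_rng(0)
d, T = 8, 5                                # embedding size, number of temporal snapshots
snapshots = rng.normal(size=(T, d))        # per-snapshot graph embeddings (assumed precomputed)
cls = rng.normal(size=(1, d))              # CLS token summarizing the full sequence
pos = rng.normal(size=(T + 1, d))          # position embeddings (learned in practice; random here)

X = np.vstack([cls, snapshots]) + pos      # prepend CLS, add positions
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
graph_repr = out[0]                        # CLS output acts as the sequence-level representation
```

Because attention connects every position to every other in one step, the CLS row attends directly to each snapshot, which is the mechanism the paper uses to relate the full graph to its subparts and to capture long-range temporal dependence.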
This article was submitted to Text-mining and Literature-based Discovery, a section of the journal Frontiers in Research Metrics and Analytics.
Edited by: Ying Lian, Communication University of China, China
Reviewed by: Fengming Liu, Shandong Normal University, China; Ning Ma, University of Chinese Academy of Sciences (CAS), China