MLSA4Rec: Mamba Combined with Low-Rank Decomposed Self-Attention for Sequential Recommendation
Format: Journal Article
Language: English
Published: 17.07.2024
Summary: In applications such as e-commerce, online education, and streaming services, sequential recommendation systems play a critical role. Despite the excellent performance of self-attention-based sequential recommendation models in capturing dependencies between items in a user's interaction history, their quadratic complexity and lack of structural bias limit their applicability. Recently, some works have replaced the self-attention module in sequential recommenders with Mamba, which has linear complexity and structural bias. However, these works have overlooked the complementarity between the two approaches. To address this issue, this paper proposes a new hybrid recommendation framework, Mamba combined with Low-Rank decomposed Self-Attention for Sequential Recommendation (MLSA4Rec), whose complexity is linear in the length of the user's historical interaction sequence. Specifically, MLSA4Rec designs an efficient Mamba-LSA interaction module. This module introduces a low-rank decomposed self-attention (LSA) module with linear complexity and injects structural bias into it through Mamba. The LSA module analyzes user preferences from a different perspective and, through a gated information transmission mechanism, dynamically guides Mamba to focus on important information in the user's historical interactions. Finally, MLSA4Rec combines the user preference information refined by the Mamba and LSA modules to accurately predict the user's next interaction. To our knowledge, this is the first study to combine Mamba and self-attention in sequential recommendation systems. Experimental results on three real-world datasets show that MLSA4Rec outperforms existing self-attention- and Mamba-based sequential recommendation models in recommendation accuracy, demonstrating the great potential of combining Mamba and self-attention.
DOI: 10.48550/arxiv.2407.13135
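
The abstract does not include the model details, but the two mechanisms it names are standard enough to sketch. The PyTorch snippet below is a minimal illustration, not the authors' code: it assumes a Linformer-style low-rank decomposition for the LSA branch (compressing keys and values to a fixed rank `r`) and a simple sigmoid gate for the Mamba-LSA fusion. All class and parameter names (`LowRankSelfAttention`, `MambaLSALayer`, `rank`) are hypothetical, and `nn.Identity()` stands in for a real Mamba block.

```python
# Hypothetical sketch of the two ideas named in the abstract:
# (1) low-rank decomposed self-attention (LSA) with linear complexity, and
# (2) a gated fusion of a linear-time Mamba-style branch with the LSA branch.
# Not the authors' implementation; names and shapes are illustrative.
import torch
import torch.nn as nn


class LowRankSelfAttention(nn.Module):
    """Linformer-style low-rank attention: learned projections compress the
    length-n key/value sequences to a fixed rank r, so the attention map is
    (n x r) instead of (n x n), giving O(n * r * d) cost. Causal masking is
    omitted for brevity."""

    def __init__(self, d_model: int, seq_len: int, rank: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # Length-compression matrices of shape (rank, seq_len).
        self.proj_k = nn.Parameter(torch.randn(rank, seq_len) / seq_len**0.5)
        self.proj_v = nn.Parameter(torch.randn(rank, seq_len) / seq_len**0.5)
        self.scale = d_model**-0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, n, d)
        q, k, v = self.q(x), self.k(x), self.v(x)
        k = self.proj_k @ k            # compress keys:   (batch, r, d)
        v = self.proj_v @ v            # compress values: (batch, r, d)
        attn = torch.softmax(q @ k.transpose(1, 2) * self.scale, dim=-1)
        return attn @ v                # (batch, n, d)


class MambaLSALayer(nn.Module):
    """Hybrid layer: a linear-time Mamba-style branch contributes structural
    (recurrence/order) bias, the LSA branch re-reads the whole sequence, and
    a sigmoid gate mixes the two views position by position (a rough
    stand-in for the paper's gated information transmission mechanism)."""

    def __init__(self, d_model: int, seq_len: int, rank: int,
                 mamba_block: nn.Module):
        super().__init__()
        self.mamba = mamba_block  # any (batch, n, d) -> (batch, n, d) mixer,
                                  # e.g. an official mamba_ssm Mamba block
        self.lsa = LowRankSelfAttention(d_model, seq_len, rank)
        self.gate = nn.Linear(2 * d_model, d_model)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h_mamba = self.mamba(x)
        h_lsa = self.lsa(x)
        # Gate computed from both views decides the per-position mix.
        g = torch.sigmoid(self.gate(torch.cat([h_mamba, h_lsa], dim=-1)))
        fused = g * h_mamba + (1.0 - g) * h_lsa
        return self.norm(x + fused)   # residual connection


# Shape check with a placeholder for the Mamba branch.
layer = MambaLSALayer(d_model=64, seq_len=50, rank=8,
                      mamba_block=nn.Identity())
out = layer(torch.randn(2, 50, 64))  # -> torch.Size([2, 50, 64])
```

With keys and values compressed to rank r, each attention map is n x r rather than n x n, which is what keeps the whole layer linear in the sequence length n; MLSA4Rec's actual decomposition and gating design may differ from this sketch.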