Towards a Deep Attention-Based Sequential Recommender System


Bibliographic Details
Published in: IEEE Access, Vol. 8, pp. 178073-178084
Main Authors: Yakhchi, Shahpar; Beheshti, Amin; Ghafari, Seyed-Mohssen; Orgun, Mehmet A.; Liu, Guanfeng
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020
Summary: With the availability of large amounts of user-generated online data, discovering users' sequential behaviour has become an integral part of a Sequential Recommender System (SRS). Combining recently observed items (i.e., short-term preferences) with previously interacted items (i.e., long-term preferences) has gained increasing attention in recent years. However, existing methods mostly assume that all adjacent items in a sequence are highly dependent, which may not hold in real-world scenarios due to the uncertainty of customers' shopping behaviours: a user-item interaction sequence may contain irrelevant items, which in turn lead to false dependencies between items. Moreover, current studies usually assign a static representation to each item when modeling a user's long-term preferences, and therefore cannot differentiate the contributions of individual items. Furthermore, these two types of user preferences have typically been modeled separately and then combined linearly, which may fail to capture complicated user-item interactions. To overcome the above-mentioned problems, we propose a novel Deep Attention-based Sequential (DAS) model. DAS consists of three blocks: (i) an embedding block, which embeds users and items into low-dimensional spaces; (ii) an attention block, which discriminatively learns dependencies among items in both the users' long-term and short-term item sets; and (iii) a fully-connected block, which first learns a nonlinear mixture of the users' preference representations and then combines it with the user embeddings to produce personalized recommendations. Extensive experiments demonstrate the superiority of the proposed model over state-of-the-art approaches in SRSs.
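The three-block pipeline the abstract describes can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the embedding sizes, the simple self-attention pooling, and the fully-connected weight matrix `W` are all illustrative assumptions standing in for the learned components of the actual DAS model.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16                     # embedding dimension (illustrative)
n_items, n_users = 100, 10

# (i) Embedding block: users and items mapped to low-dimensional vectors.
item_emb = rng.normal(scale=0.1, size=(n_items, d))
user_emb = rng.normal(scale=0.1, size=(n_users, d))

def attention_pool(E):
    """Self-attention over one item set: weight each item by its
    relevance to the others (so irrelevant items contribute less),
    then pool the set into a single preference vector."""
    scores = E @ E.T / np.sqrt(d)                  # pairwise relevance
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return (weights @ E).mean(axis=0)              # pooled (d,) vector

def recommend_scores(user_id, long_term_items, short_term_items, W):
    # (ii) Attention block: applied to both the long-term and the
    # short-term item sets, giving each item a learned contribution.
    p_long = attention_pool(item_emb[long_term_items])
    p_short = attention_pool(item_emb[short_term_items])
    # (iii) Fully-connected block: nonlinear mixture of the two
    # preference vectors, then combined with the user embedding
    # for a personalized score over all candidate items.
    mixed = np.tanh(np.concatenate([p_long, p_short]) @ W)
    profile = mixed + user_emb[user_id]
    return item_emb @ profile                      # one score per item

W = rng.normal(scale=0.1, size=(2 * d, d))  # hypothetical FC weights
scores = recommend_scores(0, [3, 7, 42], [55, 56], W)
print(scores.shape)
```

The key contrast with the linear-combination baselines the abstract criticizes is the `tanh` mixing step: the long-term and short-term vectors interact through a nonlinearity rather than being added with fixed weights.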
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3004656