A Collaborative Neural Model for Rating Prediction by Leveraging User Reviews and Product Images
Published in | Information Retrieval Technology, pp. 99–111
---|---
Format | Book Chapter
Language | English
Published | Cham: Springer International Publishing
Series | Lecture Notes in Computer Science
Summary: Product images and user reviews are two important types of side information for improving recommender systems. Product images capture users' appearance preferences, while user reviews reflect customers' opinions on product properties that might not be directly visible. The two can complement each other to jointly improve recommendation accuracy. In this paper, we present a novel collaborative neural model for rating prediction that jointly utilizes user reviews and product images. First, product images are leveraged to enhance the item representation. Then, to utilize user reviews, we couple the processes of rating prediction and review generation via a deep neural network. In a multi-task learning fashion, the hidden features extracted by the network are shared to predict the rating via a softmax function and to generate the review content via an LSTM-based model. To our knowledge, this is the first unified neural network model for rating prediction that jointly exploits both product images and user reviews, combining the benefits of both kinds of information. Extensive experiments on four real-world datasets demonstrate the superiority of our proposed model over several competitive baselines.
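The multi-task design sketched in the summary (shared hidden features feeding a softmax rating head and an LSTM-based review decoder) can be illustrated with a minimal NumPy sketch. All names, dimensions, and the fusion-by-concatenation scheme below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the shared-feature multi-task idea from the abstract:
# an image-enhanced item representation feeds a shared encoder, whose hidden
# features drive (1) softmax rating prediction and (2) LSTM review generation.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(-1, keepdims=True))
    return e / e.sum(-1, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# --- Shared encoder: fuse user, item, and product-image embeddings ---
d_user, d_item, d_img, d_hid = 16, 16, 32, 24          # assumed sizes
W_enc = rng.normal(0, 0.1, (d_user + d_item + d_img, d_hid))

def shared_features(u, v, img):
    # The image embedding enhances the item representation (per the abstract);
    # here we simply concatenate and project.
    return np.tanh(np.concatenate([u, v, img]) @ W_enc)

# --- Task 1: rating prediction as 5-way softmax classification ---
W_rate = rng.normal(0, 0.1, (d_hid, 5))

def predict_rating(h):
    p = softmax(h @ W_rate)             # distribution over ratings 1..5
    return float(p @ np.arange(1, 6))   # expected rating

# --- Task 2: review generation with an LSTM cell seeded by h ---
vocab, d_emb = 100, d_hid
E = rng.normal(0, 0.1, (vocab, d_emb))                 # token embeddings
W_lstm = rng.normal(0, 0.1, (d_emb + d_hid, 4 * d_hid))  # gates i, f, o, g
W_out = rng.normal(0, 0.1, (d_hid, vocab))

def lstm_step(x, state):
    h_prev, c = state
    gates = np.concatenate([x, h_prev]) @ W_lstm
    i, f, o, g = np.split(gates, 4)
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    h_new = sigmoid(o) * np.tanh(c)
    return h_new, (h_new, c)

def generate_review(h, length=5):
    # Greedy decoding; the shared features h initialize the LSTM hidden state.
    token, state, out = 0, (h, np.zeros(d_hid)), []
    for _ in range(length):
        h_new, state = lstm_step(E[token], state)
        token = int(np.argmax(softmax(h_new @ W_out)))
        out.append(token)
    return out

u, v = rng.normal(size=d_user), rng.normal(size=d_item)
img = rng.normal(size=d_img)
h = shared_features(u, v, img)
rating = predict_rating(h)
tokens = generate_review(h)
print(round(rating, 2), tokens)
```

In training, both heads would be optimized jointly (e.g. cross-entropy on the rating softmax plus a per-token loss on the review decoder), so that the shared encoder is shaped by both signals; with random weights this sketch only demonstrates the data flow.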
ISBN | 3319701444; 9783319701448
ISSN | 0302-9743; 1611-3349
DOI | 10.1007/978-3-319-70145-5_8