GreatMeta: Gradient-aware adaptive meta-learning for cold-start recommendations

Bibliographic Details
Published in: Data Mining and Knowledge Discovery, Vol. 39, No. 5, p. 42
Main Authors: Du, Yantong; Chen, Rui; Han, Qilong; Tan, Qiaoyu; Song, Hongtao; Zhang, Chi
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.09.2025
ISSN: 1384-5810, 1573-756X
DOI: 10.1007/s10618-025-01123-5

Summary: In recent years, optimizer-based meta-learning, e.g., model-agnostic meta-learning (MAML), has emerged as a powerful tool to tackle the cold-start problem in recommender systems. However, existing methods overlook the key fact that user preferences are inherently imbalanced in real-world applications. Learning indiscriminately from imbalanced user preferences may lead to an imbalanced meta-model that introduces systematic bias in subsequent recommendations. In this paper, we explore the impact of imbalanced user preferences on the meta-training process and consequently propose a novel Gradient-aware adaptive Meta-learning (GreatMeta) model for cold-start recommendations. Inspired by prior work, our key idea is to leverage gradient signals to understand a meta-model’s status and guide the learning process. More specifically, we put forward three original gradient-aware factors to measure the state of the meta-model and the level of user imbalance so that we can make use of the correlation between the meta-model and the underlying users to generate a balanced meta-model. Based on these factors, we design a preference-aware scheduler to adaptively adjust each user’s contribution in the meta-training process, which helps the resultant meta-model better generalize to minority cold-start users. We also introduce a personalized encoder that effectively utilizes limited data and accelerates the adaptation process of the balanced meta-model. Moreover, we theoretically justify the rationality of the three proposed gradient-aware factors. Extensive experimental results on public benchmark datasets demonstrate the superiority of GreatMeta over a large number of state-of-the-art recommendation methods, confirming the value of addressing the imbalance of user preferences in MAML-based meta-learning for cold-start recommendations. The code is available at https://github.com/YantongDU/GreatMeta.
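To make the general idea concrete, the sketch below shows a first-order MAML-style meta-training loop in which each task's (user's) contribution to the outer update is reweighted by a gradient-derived factor. This is purely illustrative and is not the paper's GreatMeta algorithm: the tasks, the linear model, and the inverse-gradient-norm weighting rule are all assumptions standing in for the paper's three gradient-aware factors and preference-aware scheduler (see the linked repository for the actual method).

```python
import random

random.seed(0)

def loss_and_grad(w, data):
    # Mean squared error for a 1-D linear model y = w * x, and its gradient in w.
    n = len(data)
    loss = sum((w * x - y) ** 2 for x, y in data) / n
    grad = sum(2 * (w * x - y) * x for x, y in data) / n
    return loss, grad

def meta_train(tasks, meta_lr=0.05, inner_lr=0.1, steps=200):
    w_meta = 0.0  # shared meta-initialization
    for _ in range(steps):
        grads, weights = [], []
        for support, query in tasks:
            # Inner loop: one adaptation step on the task's support set.
            _, g_sup = loss_and_grad(w_meta, support)
            w_task = w_meta - inner_lr * g_sup
            # Outer gradient on the query set (first-order approximation of MAML).
            _, g_qry = loss_and_grad(w_task, query)
            grads.append(g_qry)
            # Hypothetical gradient-aware factor: down-weight tasks whose outer
            # gradient is large, so no single group of users dominates the update.
            weights.append(1.0 / (1.0 + abs(g_qry)))
        total = sum(weights)
        update = sum(wt * g for wt, g in zip(weights, grads)) / total
        w_meta -= meta_lr * update  # weighted outer (meta) update
    return w_meta

def make_task(slope, n=10):
    # A task = (support set, query set) drawn from y = slope * x + noise.
    xs = [random.uniform(-1, 1) for _ in range(n)]
    pts = [(x, slope * x + random.gauss(0, 0.01)) for x in xs]
    return pts[: n // 2], pts[n // 2 :]

# Two "majority" users with preference slope ~2 and one "minority" user with ~5.
tasks = [make_task(2.0), make_task(2.0), make_task(5.0)]
w = meta_train(tasks)
print(round(w, 2))  # a meta-initialization between the majority and minority slopes
```

Setting all weights to 1.0 recovers plain first-order MAML, which pulls the meta-initialization toward the majority preference; the reweighting illustrates how a gradient signal can be used to keep minority users from being drowned out.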