Learning to Improve Affinity Ranking for Diversity Search


Bibliographic Details
Published in: Information Retrieval Technology, Vol. 9994, pp. 335-341
Main Authors: Wu, Yue; Li, Jingfei; Zhang, Peng; Song, Dawei
Format: Book Chapter
Language: English
Published: Springer International Publishing AG, Switzerland, 2016
Series: Lecture Notes in Computer Science
ISBN: 9783319480503, 3319480502
ISSN: 0302-9743, 1611-3349
DOI: 10.1007/978-3-319-48051-0_28

Summary: Search diversification plays an important role in modern search engines, especially when user-issued queries are ambiguous and the top-ranked results are redundant. Several diversity search approaches have been proposed to reduce the information redundancy of the retrieved results, but they do not consider maximizing topic coverage. To address this problem, the Affinity ranking model was developed to maximize topic coverage while reducing information redundancy. However, the original model does not include a learning algorithm for parameter tuning, which limits how far its performance can be optimized. To further improve the diversity performance of the Affinity ranking model, and inspired by its ranking principle, we propose a learning approach based on the learning-to-rank framework. Our learning model not only accounts for topic coverage maximization and redundancy reduction by formalizing a series of features, but also optimizes the diversity metric by extending the well-known learning-to-rank algorithm LambdaMART. Comparative experiments on TREC diversity tracks show the effectiveness of our model.
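The affinity-based ranking principle the summary refers to can be illustrated with a short sketch. The following Python snippet is a minimal, hypothetical illustration, not the chapter's actual method: it builds a cosine-similarity affinity graph over candidate documents, estimates each document's information richness with a PageRank-style diffusion over that graph, and greedily re-ranks while penalizing documents similar to those already selected. All function names and parameter values (similarity threshold, damping factor, redundancy penalty) are illustrative assumptions.

```python
# Hypothetical sketch of affinity-graph diversification; the chapter's exact
# formulation, features, and learned parameters may differ.
import numpy as np


def affinity_matrix(doc_vectors: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Cosine-similarity affinity graph; weak links below the threshold are dropped."""
    unit = doc_vectors / np.clip(np.linalg.norm(doc_vectors, axis=1, keepdims=True), 1e-12, None)
    sims = unit @ unit.T
    np.fill_diagonal(sims, 0.0)
    sims[sims < threshold] = 0.0
    return sims


def information_richness(affinity: np.ndarray, damping: float = 0.85, iters: int = 50) -> np.ndarray:
    """PageRank-style diffusion over the affinity graph (power iteration)."""
    n = affinity.shape[0]
    col_sums = affinity.sum(axis=0)
    # Column-normalize; isolated documents fall back to a uniform column.
    trans = np.divide(affinity, col_sums, out=np.full_like(affinity, 1.0 / n), where=col_sums > 0)
    score = np.full(n, 1.0 / n)
    for _ in range(iters):
        score = (1 - damping) / n + damping * (trans @ score)
    return score


def diversify(doc_vectors: np.ndarray, k: int, penalty: float = 0.5) -> list[int]:
    """Greedy re-ranking: information richness minus a redundancy penalty
    against the documents already selected."""
    affinity = affinity_matrix(doc_vectors)
    richness = information_richness(affinity)
    selected: list[int] = []
    remaining = set(range(len(doc_vectors)))
    while remaining and len(selected) < k:
        best = max(
            remaining,
            key=lambda i: richness[i] - penalty * max((affinity[i, j] for j in selected), default=0.0),
        )
        selected.append(best)
        remaining.remove(best)
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    docs = rng.random((20, 50))   # toy document vectors
    print(diversify(docs, k=5))   # indices of a diversified top-5
```

The learning extension described in the summary is not shown here; in a LambdaMART-style setup it would presumably drive the lambda gradients with a diversity-aware metric (such as α-NDCG) instead of plain NDCG.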