A proactive decision support system for reviewer recommendation in academia

Bibliographic Details
Published in: Expert Systems with Applications, Vol. 169, p. 114331
Main Authors: Pradhan, Tribikram; Sahoo, Suchit; Singh, Utkarsh; Pal, Sukomal
Format: Journal Article
Language: English
Published: New York: Elsevier Ltd, 01.05.2021
Elsevier BV

Summary: Peer review is an essential part of scientific communication, ensuring the quality of publications and a healthy scientific evaluation process. Assigning appropriate reviewers poses a great challenge for program chairs and journal editors for many reasons, including relevance, fair judgment, absence of conflict of interest, and reviewer qualification in terms of scientific impact. With a steady increase in the number of research domains, scholarly venues, researchers, and papers in academia, manually selecting and assessing adequate reviewers is becoming a tedious and time-consuming task. Traditional approaches to reviewer selection focus mainly on matching research relevance by keywords or disciplines. In real-world systems, however, various other factors often need to be considered. Therefore, we propose a multilayered approach that integrates a Topic Network, a Citation Network, and a Reviewer Network into a reviewer Recommender System (TCRRec). We explore various aspects, including relevance between reviewer candidates and the submission, authority, expertise, diversity, and conflict of interest, and integrate them into the proposed framework TCRRec. The paper also addresses cold-start issues for researchers with unique areas of interest or for isolated researchers. Experiments on the NIPS and AMiner datasets demonstrate that the proposed TCRRec outperforms state-of-the-art recommendation techniques in terms of the standard metrics precision@k, MRR, nDCG@k, authority, expertise, diversity, and coverage.
Highlights:
•Relevance, authority, expertise, diversity, and conflict of interest are considered.
•Experiments on the NIPS and AMiner datasets show the effectiveness of the proposed system.
•Temporal changes in reviewers' interests are incorporated.
•Evaluated in terms of precision, nDCG, MRR, authority, and expertise.
•Investigated the trade-off of diversity and coverage vs. precision.
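
The summary refers to standard top-k ranking metrics (precision@k, MRR, nDCG@k) and to combining several signals (topic relevance, citation-based authority, reviewer-network expertise) while excluding conflicted reviewers. The sketch below is a minimal, hypothetical Python illustration of those two ideas only; it is not the authors' TCRRec implementation, and all function names, layer weights, candidates, and scores are invented for the example.

import math
from typing import Dict, List, Set

def rank_candidates(layer_scores: Dict[str, Dict[str, float]],
                    layer_weights: Dict[str, float],
                    conflicts: Set[str]) -> List[str]:
    # Hypothetical scheme: weighted sum of per-layer scores after removing
    # candidates with a conflict of interest (not the paper's actual method).
    candidates = set().union(*(s.keys() for s in layer_scores.values())) - conflicts
    def total(c: str) -> float:
        return sum(w * layer_scores[layer].get(c, 0.0)
                   for layer, w in layer_weights.items())
    return sorted(candidates, key=total, reverse=True)

def precision_at_k(ranked: List[str], relevant: Set[str], k: int) -> float:
    # Fraction of the top-k recommendations that are relevant.
    return sum(1 for r in ranked[:k] if r in relevant) / k

def mrr(ranked: List[str], relevant: Set[str]) -> float:
    # Reciprocal rank of the first relevant recommendation (0 if none).
    for i, r in enumerate(ranked, start=1):
        if r in relevant:
            return 1.0 / i
    return 0.0

def ndcg_at_k(ranked: List[str], relevant: Set[str], k: int) -> float:
    # Binary-relevance nDCG@k: DCG of the ranking divided by the ideal DCG.
    dcg = sum(1.0 / math.log2(i + 1)
              for i, r in enumerate(ranked[:k], start=1) if r in relevant)
    ideal = sum(1.0 / math.log2(i + 1)
                for i in range(1, min(k, len(relevant)) + 1))
    return dcg / ideal if ideal > 0 else 0.0

if __name__ == "__main__":
    # Toy data: three candidate reviewers scored by three hypothetical layers.
    layer_scores = {
        "topic":    {"r1": 0.9, "r2": 0.4, "r3": 0.7},
        "citation": {"r1": 0.5, "r2": 0.8, "r3": 0.6},
        "reviewer": {"r1": 0.6, "r2": 0.3, "r3": 0.9},
    }
    layer_weights = {"topic": 0.5, "citation": 0.3, "reviewer": 0.2}
    ranking = rank_candidates(layer_scores, layer_weights, conflicts={"r2"})
    relevant = {"r3"}
    print(ranking)                                      # ['r1', 'r3']
    print(precision_at_k(ranking, relevant, k=2))       # 0.5
    print(mrr(ranking, relevant))                       # 0.5
    print(round(ndcg_at_k(ranking, relevant, k=2), 3))  # 0.631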
ISSN: 0957-4174
eISSN: 1873-6793
DOI: 10.1016/j.eswa.2020.114331