Optimizing answer selection in community question answering through pre-trained and large language models
Saved in:
Main Author | |
---|---|
Format | Conference Proceeding |
Language | English |
Published | SPIE, 30.09.2024 |
Summary: | Community Question Answering (CQA) has become increasingly prevalent in recent years. However, the large volume of answers makes it difficult for users to identify the most pertinent ones, rendering answer selection a vital subtask within CQA. In this paper, we introduce the Question-Answer cross attention networks (QAN) with pre-trained models for improved answer selection, and we leverage large language models (LLMs) with knowledge augmentation to enhance selection further. Specifically, we use the BERT model as the encoder layer, pre-training on question subjects, question bodies, and answers separately; a cross attention mechanism then identifies the most relevant answer for each question. Experimental results demonstrate that the QAN model attains state-of-the-art performance on the SemEval2015 and SemEval2017 datasets. Additionally, we use LLMs to generate external knowledge from questions and correct answers, enhancing the answer selection task. By optimizing the LLM prompts in various aspects, we find that incorporating external knowledge improves the correct answer selection rate on both datasets, and that optimized prompts enable LLMs to select the correct answer for a greater number of questions. |
---|---|
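The summary describes the core scoring step of QAN: BERT-encoded question and answer tokens are compared through a cross attention layer, and the most relevant answer is selected. A minimal sketch of that step, using random NumPy arrays in place of real BERT outputs (all function names, pooling choices, and dimensions here are illustrative assumptions, not the paper's actual implementation):

```python
import numpy as np

def cross_attention(q_tokens, a_tokens):
    """Scaled dot-product cross attention: question tokens attend to answer tokens."""
    d = q_tokens.shape[-1]
    scores = q_tokens @ a_tokens.T / np.sqrt(d)       # (Lq, La) similarity matrix
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over answer tokens
    return weights @ a_tokens                         # answer-aware question representation

def relevance_score(q_tokens, a_tokens):
    """Mean-pool the attended representation and compare it with the pooled question."""
    attended = cross_attention(q_tokens, a_tokens).mean(axis=0)
    pooled_q = q_tokens.mean(axis=0)
    return float(attended @ pooled_q /
                 (np.linalg.norm(attended) * np.linalg.norm(pooled_q)))

# Toy embeddings standing in for BERT encoder outputs (hypothetical data).
rng = np.random.default_rng(0)
question = rng.normal(size=(5, 16))                   # 5 question tokens, dim 16
answers = [rng.normal(size=(7, 16)) for _ in range(3)]

# Answer selection: pick the candidate with the highest cross-attention score.
best = max(range(3), key=lambda i: relevance_score(question, answers[i]))
```

In the paper's setting, `q_tokens` would come from BERT runs over the question subject and body, and each candidate answer would be encoded separately before scoring.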
Bibliography: | Conference Location: Guangzhou, China; Conference Date: 2024-05-17 to 2024-05-19 |
ISBN: | 9781510683143; 1510683143 |
ISSN: | 0277-786X |
DOI: | 10.1117/12.3045190 |