Search-engine-augmented dialogue response generation with cheaply supervised query production

Bibliographic Details
Published in: Artificial Intelligence, Vol. 319, p. 103874
Main Authors: Wang, Ante, Song, Linfeng, Liu, Qi, Mi, Haitao, Wang, Longyue, Tu, Zhaopeng, Su, Jinsong, Yu, Dong
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.06.2023
Summary: Knowledge-aided dialogue response generation aims to augment chatbots with relevant external knowledge in the hope of generating more informative responses. Most previous work assumes that the relevant knowledge is given as input or retrieved from a static pool of knowledge. However, this assumption does not hold in the real world, where knowledge is continually updated and a chatbot has to dynamically retrieve useful knowledge. We propose a dialogue model that can access the vast and dynamic information of any search engine for response generation. As the core module, a query producer generates queries from a dialogue context to interact with a search engine. We design a training algorithm using cheap noisy supervision for the query producer, where the signals are obtained by comparing retrieved articles with the next dialogue response. As a result, the query producer can be trained without any human annotation of gold queries, making it easily transferable to other domains and search engines. Experiments show that our query producer achieves R@1 and R@5 rates of 62.4% and 74.8% for retrieving gold knowledge, and that the overall model generates better responses than strong knowledge-aided baselines using BART [1] and other typical systems.
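
To make the cheap-supervision idea concrete, the following is a minimal Python sketch, not code from the paper: the names token_f1, label_queries, and search_fn are illustrative assumptions, and search_fn stands in for whatever search-engine API is used. Each candidate query drawn from the dialogue context is sent to the search engine, and the lexical overlap between the retrieved articles and the gold next response serves as a noisy score; the best-scoring candidate can act as a pseudo-gold target for training the query producer, with no human-annotated gold queries.

    from collections import Counter

    def token_f1(candidate: str, reference: str) -> float:
        """Token-level F1 between two strings, used here as a cheap
        relevance proxy between a retrieved article and the response."""
        cand, ref = candidate.lower().split(), reference.lower().split()
        if not cand or not ref:
            return 0.0
        overlap = sum((Counter(cand) & Counter(ref)).values())
        if overlap == 0:
            return 0.0
        precision = overlap / len(cand)
        recall = overlap / len(ref)
        return 2 * precision * recall / (precision + recall)

    def label_queries(candidate_queries, next_response, search_fn, top_k=5):
        """Score each candidate query by comparing its retrieved articles
        with the gold next dialogue response; the scores serve as noisy
        supervision for the query producer.

        search_fn(query, top_k) is a stand-in for any search-engine API
        that returns a list of article texts.
        """
        scored = []
        for query in candidate_queries:
            articles = search_fn(query, top_k)
            # A query is "good" if some retrieved article overlaps
            # strongly with the gold next response.
            score = max((token_f1(a, next_response) for a in articles),
                        default=0.0)
            scored.append((query, score))
        # The highest-scoring query can be used as a pseudo-gold target.
        return max(scored, key=lambda item: item[1])

Under this sketch, candidate queries might simply be spans extracted from the dialogue context; the returned pseudo-gold query could then supervise a sequence-to-sequence query producer in the usual way.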
ISSN: 0004-3702, 1872-7921
DOI: 10.1016/j.artint.2023.103874