DialFill: Utilizing Dialogue Filling to Integrate Retrieved Knowledge in Responses

Bibliographic Details
Published in: IEEE Access, Vol. 13, pp. 61123-61135
Main Authors: Xue, Qiang; Takiguchi, Tetsuya; Ariki, Yasuo
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2025
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2025.3555650

More Information
Summary: In knowledge-based dialogue systems, generating responses that are both contextually relevant and factually accurate requires efficient and precise integration of external knowledge. Pre-trained language model approaches (LM-based) leverage extensive general knowledge but often struggle with accuracy in domain-specific or time-sensitive contexts because they rely on implicit knowledge. Conversely, knowledge-based approaches (KB-based) retrieve relevant information from external sources before response generation, yet they frequently fail to incorporate the retrieved content effectively, producing responses that may omit critical information. To address these limitations, we propose DialFill, a novel response generation framework that reframes dialogue generation as a Dialogue Filling task. DialFill constructs an intermediate masked response that explicitly integrates the retrieved knowledge and then predicts the missing components, ensuring the final response incorporates all relevant information seamlessly. We validate DialFill on both unstructured (Wizard-of-Wikipedia) and structured (OpenDialKG) knowledge benchmarks, demonstrating competitive performance against state-of-the-art methods. In large language model experiments, DialFill significantly reduces the rate at which retrieved content is ignored, lowering the proportion of ignored knowledge instances from 17.8% to 0.2%. These results highlight DialFill's potential to enhance the accuracy, reliability, and adaptability of knowledge-based dialogue systems, marking a significant advancement in the field.
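
The abstract describes dialogue filling as a two-step process: first build an intermediate masked response that already contains the retrieved knowledge, then predict the missing spans conditioned on the dialogue. The Python sketch below only illustrates that data flow under assumed conventions; the [MASK] placeholder, the build_masked_response and dialogue_filling helpers, and the fill_spans generator are hypothetical stand-ins, not the authors' implementation or masking scheme.

from typing import Callable, List

MASK = "[MASK]"  # hypothetical placeholder token; the paper's actual masking scheme may differ

def build_masked_response(knowledge_spans: List[str]) -> str:
    """Step 1 (assumed): build an intermediate masked response that embeds
    every retrieved knowledge span verbatim, with masks for connective text."""
    parts = [MASK]
    for span in knowledge_spans:
        parts.append(span)
        parts.append(MASK)
    return " ".join(parts)

def dialogue_filling(history: List[str],
                     knowledge_spans: List[str],
                     fill_spans: Callable[[List[str], str], str]) -> str:
    """Step 2 (assumed): have a generator predict only the masked components,
    so the final response necessarily contains the retrieved knowledge."""
    template = build_masked_response(knowledge_spans)
    return fill_spans(history, template)

# Usage with a trivial stand-in generator (a real system would condition a
# language model on the dialogue history and decode text for each masked slot):
if __name__ == "__main__":
    history = ["User: Who directed Inception?"]
    knowledge = ["Inception was directed by Christopher Nolan"]
    dummy_fill = lambda hist, tmpl: tmpl.replace(MASK, "").strip()
    print(dialogue_filling(history, knowledge, dummy_fill))

The point of the sketch is the contract it enforces: because the retrieved spans are placed into the template before generation, the generator can only add connective text around them, which is how the approach avoids ignoring retrieved content.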