Exploring Factual Entailment with NLI: A News Media Study
Main Authors | |
Format | Journal Article |
Language | English |
Published | 24.06.2024 |
Subjects | |
Summary: | We explore the relationship between factuality and Natural Language Inference (NLI) by introducing FactRel -- a novel annotation scheme that models *factual* rather than *textual* entailment -- and use it to annotate a dataset of naturally occurring sentence pairs from news articles. Our analysis shows that 84% of factually supporting pairs and 63% of factually undermining pairs do not amount to NLI entailment or contradiction, respectively, suggesting that factual relations are better suited to analyzing media discourse. We experiment with models for pairwise classification on the new dataset and find that, in some cases, generating synthetic data with GPT-4 on the basis of the annotated dataset can improve performance. Surprisingly, few-shot learning with GPT-4 yields strong results, on par with medium-sized LMs (DeBERTa) trained on the labelled dataset. We hypothesize that these results indicate the fundamental dependence of this task on both world knowledge and advanced reasoning abilities. |
DOI: | 10.48550/arxiv.2406.16842 |
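To illustrate the kind of pairwise classification the abstract describes, here is a minimal sketch using a DeBERTa encoder via Hugging Face Transformers. The three-way label set (the abstract names "supporting" and "undermining" pairs; "neutral" is an assumption here), the specific checkpoint, and the example sentences are illustrative only, not the paper's actual setup, and the classification head below would need to be fine-tuned on the annotated pairs before its predictions mean anything.

```python
# Hypothetical sketch of pairwise factual-relation classification with DeBERTa.
# Label set, checkpoint, and example sentences are assumptions, not the paper's setup.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

LABELS = ["supporting", "undermining", "neutral"]  # assumed label set

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "microsoft/deberta-v3-base", num_labels=len(LABELS)
)  # classification head starts untrained; fine-tune on the labelled pairs first

sent_a = "The minister resigned after the audit was published."
sent_b = "The audit found irregularities in the ministry's accounts."

# Encode the sentence pair jointly, as in standard NLI-style pair classification.
inputs = tokenizer(sent_a, sent_b, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(LABELS[logits.argmax(dim=-1).item()])
```

The few-shot GPT-4 setting mentioned in the abstract would instead prompt the model with a handful of annotated pairs and ask it to label a new pair directly, with no gradient updates.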