Generating Textual Entailment Using Residual LSTMs

Generating textual entailment (GTE) is a recently proposed task that studies how to infer a sentence from a given premise. Current sequence-to-sequence GTE models are prone to producing invalid sentences when faced with sufficiently complex premises. Moreover, the lack of appropriate evaluation criteria hinder...

Bibliographic Details
Published in: Chinese Computational Linguistics and Natural Language Processing Based on Naturally Annotated Big Data, Vol. 10565, pp. 263-272
Main Authors: Guo, Maosheng; Zhang, Yu; Zhao, Dezhi; Liu, Ting
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 2017
Series: Lecture Notes in Computer Science
