Generating Textual Entailment Using Residual LSTMs
Generating textual entailment (GTE) is a recently proposed task that studies how to infer an entailed sentence from a given premise. Current sequence-to-sequence GTE models are prone to producing invalid sentences when faced with sufficiently complex premises. Moreover, the lack of appropriate evaluation criteria hinders...
Published in: Chinese Computational Linguistics and Natural Language Processing Based on Naturally Annotated Big Data, Vol. 10565, pp. 263-272
Main Authors: , , ,
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 2017
Series: Lecture Notes in Computer Science
Subjects:
Online Access: Get full text