A Tensor Neural Network with Layerwise Pretraining: Towards Effective Answer Retrieval

Bibliographic Details
Published in: Journal of Computer Science and Technology, Vol. 31, No. 6, pp. 1151-1160
Main Authors: Bao, Xin-Qi; Wu, Yun-Fang
Format: Journal Article
Language: English
Published: New York: Springer US, 01.11.2016 (Springer Nature B.V.)
Author Affiliation: Key Laboratory of Computational Linguistics, Peking University, Beijing 100871, China

Summary: In this paper we address the answer retrieval problem in community-based question answering. To fully capture the interactions between question-answer pairs, we propose an original tensor neural network to model the relevance between them. The question and candidate answers are separately embedded into different latent semantic spaces, and a 3-way tensor is then utilized to model the interactions between latent semantics. To initialize the network layers properly, we propose a novel algorithm called the denoising tensor autoencoder (DTAE), and then implement a layerwise pretraining strategy using denoising autoencoders (DAE) on the word embedding layers and DTAE on the tensor layer. The experimental results show that our tensor neural network outperforms various baselines, including other competitive neural network methods, and that our DTAE pretraining strategy improves the system's performance and robustness.
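The summary describes scoring question-answer relevance with a 3-way tensor applied to separately embedded questions and answers. A minimal sketch of such a bilinear tensor scoring layer in NumPy (all dimensions, the slice count, and the tanh combination are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: question embedding, answer embedding, tensor slices.
d_q, d_a, k = 4, 4, 3
W = rng.standard_normal((k, d_q, d_a))  # 3-way tensor: one bilinear form per slice

def tensor_relevance(q, a, W):
    """Score a question-answer pair: s_i = q^T W[i] a for each slice i,
    then combine the k interaction scores into one scalar (sketch only)."""
    slices = np.array([q @ W[i] @ a for i in range(W.shape[0])])
    return np.tanh(slices).sum()

# Stand-in latent embeddings; in the paper these come from learned,
# pretrained embedding layers.
q = rng.standard_normal(d_q)
a = rng.standard_normal(d_a)
score = tensor_relevance(q, a, W)
```

Each tensor slice captures one pattern of pairwise interaction between the question and answer latent dimensions, which is the expressiveness a plain inner product (a single bilinear form) lacks.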
Bibliography: artificial intelligence; language parsing and understanding; machine learning
CN: 11-2296/TP
ISSN: 1000-9000
EISSN: 1860-4749
DOI: 10.1007/s11390-016-1689-4